
Review of Final Part I: Sections 2.2 -- 4.5
Jiaping Wang, Department of Mathematics
02/29/2013, Monday

Outline
Sample Space and Events
Definition of Probability

Counting Rules
Conditional Probability and Independence
Probability Distribution and Expected Values
Bernoulli, Binomial and Geometric Distributions
Negative Binomial, Poisson, Hypergeometric Distributions and MGF

Part 1. Sample Space and Events

Definition 2.1: A sample space S is a set that includes all possible outcomes of a random experiment, listed in a mutually exclusive and exhaustive way. Mutually exclusive means the outcomes in the set do not overlap; exhaustive means the list contains all possible outcomes.
Definition 2.2: An event is any subset of a sample space.

Event Operators and Venn Diagrams
There are three operators between events:
Intersection: A∩B -- a new event consisting of the elements common to A and B.
Union: A∪B -- a new event consisting of all outcomes in A or B.
Complement: A' -- the subset of all outcomes in S that are not in A.

(Venn diagrams illustrating A∪B, A∩B, and A'.)

Some Laws
Commutative laws: A∪B = B∪A, A∩B = B∩A.
Associative laws: (A∪B)∪C = A∪(B∪C), (A∩B)∩C = A∩(B∩C).
Distributive laws: A∩(B∪C) = (A∩B)∪(A∩C), A∪(B∩C) = (A∪B)∩(A∪C).

DeMorgan's laws: (A∪B)' = A'∩B', (A∩B)' = A'∪B'.
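To see these identities concretely, here is a short Python sketch (added to these notes, with made-up example sets) that checks the distributive and DeMorgan laws using Python's set operations.

# Small example sets inside a sample space S, chosen only for illustration.
S = set(range(1, 11))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}
C = {4, 6, 8}

# Distributive law: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
print(A & (B | C) == (A & B) | (A & C))   # True

# DeMorgan's laws, with complements taken relative to S.
print(S - (A | B) == (S - A) & (S - B))   # True
print(S - (A & B) == (S - A) | (S - B))   # True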

Part 2. Definition of Probability
Suppose that a random experiment has associated with it a sample space S. A probability is a numerically valued function that assigns a number P(A) to every event A so that the following axioms hold:
(1) P(A) ≥ 0.
(2) P(S) = 1.
(3) If A1, A2, ... is a sequence of mutually exclusive events (that is, Ai∩Aj = ∅ for any i ≠ j), then P(A1∪A2∪...) = P(A1) + P(A2) + ...

Some Basic Properties
1. P(∅) = 0, P(S) = 1.
2. 0 ≤ P(A) ≤ 1 for any event A.
3. P(A∪B) = P(A) + P(B) if A and B are mutually

exclusive.
4. P(A∪B) = P(A) + P(B) − P(A∩B) for general events A and B.
5. If A is a subset of B, then P(A) ≤ P(B).
6. P(A') = 1 − P(A).
7. P(A∩B') = P(A) − P(A∩B).

Inclusion-Exclusion Principle
Theorem 2.1. For events A1, A2, ..., An from the sample space S,
P(A1∪A2∪...∪An) = Σ_i P(Ai) − Σ_{i<j} P(Ai∩Aj) + Σ_{i<j<k} P(Ai∩Aj∩Ak) − ... + (−1)^(n+1) P(A1∩A2∩...∩An).

We can use induction to prove this.

Determine the Probability Values
The definition of probability only tells us the axioms that the probability function must obey; it doesn't tell us what values to assign to specific events. The values of the probabilities are usually based on empirical evidence or on careful thought about the experiment.

For example, if a die is balanced, then we may take P(Ai) = 1/6 for Ai = {i}, i = 1, 2, 3, 4, 5, 6. However, if a die is not balanced, then to determine the probabilities we need to run many experiments and use the relative frequency of each outcome, as in the sketch below.
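As a rough illustration (added here, not part of the original slides), the following Python sketch simulates rolls of a hypothetical unbalanced die and estimates each face's probability by its relative frequency; the weights are made up for the example.

import random
from collections import Counter

# Hypothetical (made-up) weights for an unbalanced six-sided die.
faces = [1, 2, 3, 4, 5, 6]
weights = [0.10, 0.15, 0.20, 0.25, 0.20, 0.10]

n_rolls = 100_000
rolls = random.choices(faces, weights=weights, k=n_rolls)

# The relative frequency of each face approximates P(Ai) for Ai = {i}.
counts = Counter(rolls)
for face in faces:
    print(face, counts[face] / n_rolls)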

Part 3. Counting Rules
Theorem 2.2 Fundamental Principle of Counting: If the first task of an experiment can result in n1 possible outcomes and, for each such outcome, the second task can result in n2 possible outcomes, then there are n1·n2 possible outcomes for the two tasks together. The principle extends to any number of tasks performed in sequence.

Order and Replacement
The number of ways to select r items from n is summarized below (see the sketch after the table).

                        Order Is Important    Order Is Not Important
With Replacement        n^r                   C(n+r−1, r)
Without Replacement     P(n, r)               C(n, r)
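As an illustration (added, not from the slides), the Python sketch below evaluates the four counting formulas for a small example, say choosing r = 3 items from n = 5; the numbers are only for demonstration.

from math import comb, perm

n, r = 5, 3  # example values, chosen only for illustration

ordered_with_replacement = n ** r                 # n^r
ordered_without_replacement = perm(n, r)          # P(n, r) = n!/(n-r)!
unordered_without_replacement = comb(n, r)        # C(n, r)
unordered_with_replacement = comb(n + r - 1, r)   # C(n+r-1, r)

print(ordered_with_replacement,       # 125
      ordered_without_replacement,    # 60
      unordered_without_replacement,  # 10
      unordered_with_replacement)     # 35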

Theorem 2.5 Partitions
Consider a case: If we roll a die 12 times, how many possible ways are there to get two 1s, two 2s, three 3s, two 4s, two 5s and one 6?
Solution: First, choose which 2 of the 12 rolls are 1s, which gives 12!/(2!10!) ways. Since those two positions are now filled by 1s, the next choice is made among the remaining 10 positions, so there are 10!/(2!8!) ways, and similarly for the remaining selections. The final result is
12!/(2!10!) × 10!/(2!8!) × 8!/(3!5!) × 5!/(2!3!) × 3!/(2!1!) × 1!/(1!0!) = 12!/(2!·2!·3!·2!·2!·1!).

Theorem 2.5 Partitions. The number of ways of partitioning n distinct objects into k groups containing n1, n2, ..., nk objects, respectively, is
n!/(n1! n2! ... nk!), where n1 + n2 + ... + nk = n.
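To tie the example to the formula, a minimal Python check (added here) computes the multinomial coefficient 12!/(2!·2!·3!·2!·2!·1!) directly.

from math import factorial, prod

group_sizes = [2, 2, 3, 2, 2, 1]   # counts of 1s, 2s, 3s, 4s, 5s, 6s
n = sum(group_sizes)               # 12 rolls in total

# Multinomial coefficient n! / (n1! n2! ... nk!)
ways = factorial(n) // prod(factorial(k) for k in group_sizes)
print(ways)  # 4989600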

Part 4. Conditional Probability and Independence

Definition 3.1
If A and B are any two events, then the conditional probability of A given B, denoted P(A|B), is
P(A|B) = P(A∩B)/P(B), provided that P(B) > 0.
Notice that P(A∩B) = P(A|B)P(B) or P(A∩B) = P(B|A)P(A).
This definition also satisfies the three axioms of probability:
(1) A∩B is a subset of B, so P(A∩B) ≤ P(B), and hence 0 ≤ P(A|B) ≤ 1;
(2) P(S|B) = P(S∩B)/P(B) = P(B)/P(B) = 1;
(3) If A1, A2, ... are mutually exclusive, then so are A1∩B, A2∩B, ..., and
P(∪Ai | B) = P((∪Ai)∩B)/P(B) = P(∪(Ai∩B))/P(B) = Σ P(Ai∩B)/P(B) = Σ P(Ai|B).
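As a small illustration (added, not in the original slides), the sketch below computes P(A|B) from assumed probabilities for a made-up pair of events and confirms the multiplicative relation P(A∩B) = P(A|B)P(B).

# Assumed (made-up) probabilities for two events A and B.
p_B = 0.4          # P(B)
p_A_and_B = 0.1    # P(A ∩ B)

# Conditional probability of A given B, per Definition 3.1.
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)                                  # 0.25

# Check the multiplicative rule: P(A ∩ B) = P(A|B) P(B).
print(abs(p_A_given_B * p_B - p_A_and_B) < 1e-12)   # True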

Definition 3.2 and Theorem 3.2
Definition 3.2: Two events A and B are said to be independent if P(A∩B) = P(A)P(B). This is equivalent to stating that P(A|B) = P(A) and P(B|A) = P(B), provided the conditional probabilities exist.
Theorem 3.2: Multiplicative Rule. If A and B are any two events, then

P(A∩B) = P(A)P(B|A) = P(B)P(A|B).
If A and B are independent, then P(A∩B) = P(A)P(B).

Theorem of Total Probability: If B1, B2, ..., Bk is a collection of mutually exclusive and exhaustive events, then for any event A we have
P(A) = Σ_{i=1}^{k} P(A|Bi)P(Bi).
Bayes' Rule. If the events B1, B2, ..., Bk form a partition of the sample space S, and A is any event in S, then
P(Bj|A) = P(A|Bj)P(Bj) / Σ_{i=1}^{k} P(A|Bi)P(Bi),   j = 1, ..., k.
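The short Python sketch below (added here, with made-up numbers) applies the total probability theorem and Bayes' rule to a hypothetical partition B1, B2, B3.

# Hypothetical partition probabilities P(Bi) and conditionals P(A|Bi).
p_B = [0.5, 0.3, 0.2]            # must sum to 1 (mutually exclusive, exhaustive)
p_A_given_B = [0.02, 0.05, 0.10]

# Total probability: P(A) = sum_i P(A|Bi) P(Bi).
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))

# Bayes' rule: P(Bj|A) = P(A|Bj) P(Bj) / P(A).
posteriors = [pa * pb / p_A for pa, pb in zip(p_A_given_B, p_B)]
print(p_A)         # about 0.045
print(posteriors)  # roughly [0.222, 0.333, 0.444]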

Part 5. Probability Distribution and Expected Value
A random variable is a real-valued function whose domain is a sample space. A random variable X is said to be discrete if it can take on only a finite number or a countably infinite number

of possible values x. The probability function of X, denoted by p(x), assigns probability to each value x of X so that the following conditions hold:
1. P(X = x) = p(x) ≥ 0;
2. Σ_x P(X = x) = 1, where the sum is over all possible values of x.

The distribution function F(b) for a random variable X is F(b) = P(X ≤ b). If X is discrete,
F(b) = Σ_{x ≤ b} p(x),
where p(x) is the probability function.

The distribution function is often called the cumulative distribution function (CDF). Any function satisfying the following four properties is a distribution function:
1. lim_{b→−∞} F(b) = 0.
2. lim_{b→+∞} F(b) = 1.
3. The distribution function is non-decreasing: if a ≤ b, then F(a) ≤ F(b).
4. The distribution function is right-continuous.
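To make the definitions concrete, here is a small Python sketch (added, using an arbitrary example distribution) that checks the probability-function conditions and builds the CDF F(b) = P(X ≤ b).

# An arbitrary example pmf for a discrete random variable X.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# Conditions: p(x) >= 0 for all x, and the probabilities sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-12

def cdf(b):
    """F(b) = P(X <= b) = sum of p(x) over all x <= b."""
    return sum(p for x, p in pmf.items() if x <= b)

print(cdf(1))    # 0.4
print(cdf(2.5))  # 0.8
print(cdf(10))   # 1.0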

Definition 4.4
The expected value of a discrete random variable X with probability distribution p(x) is given by
E(X) = Σ_x x·p(x)
(the sum is over all values of x for which p(x) > 0). We sometimes use the notation E(X) = μ for this quantity.
Note: Not all expected values exist; for E(X) to exist, the sum above must converge absolutely, i.e., Σ_x |x|·p(x) < ∞.

Theorem 4.1
If X is a discrete random variable with probability function p(x) and if g(x) is any real-valued function of X, then E[g(X)] = Σ_x g(x)·p(x).

Definitions 4.5 and 4.6
The variance of a random variable X with expected value μ is given by V(X) = E[(X − μ)^2]. Sometimes we use the notation σ^2 = E[(X − μ)^2]

for this quantity.
The standard deviation is a measure of variation that maintains the original units of measure. The standard deviation of a random variable is the square root of the variance and is given by σ = √V(X).

Theorem 4.2
For any random variable X and constants a and b:
1. E(aX + b) = aE(X) + b

2. V(aX + b) = a^2·V(X)
Standardized random variable: If X has mean μ and standard deviation σ, then Y = (X − μ)/σ has E(Y) = 0 and V(Y) = 1; Y is called the standardized version of X.
Theorem 4.3 If X is a random variable with mean μ, then V(X) = E(X^2) − μ^2.
Tchebysheff's Theorem. Let X be a random variable with mean μ and standard deviation σ. Then for any positive k, P(|X − μ|/σ < k) ≥ 1 − 1/k^2.
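The following sketch (added, reusing the same arbitrary pmf style as above) computes μ = E(X), σ^2 = V(X) via Theorem 4.3, and checks Tchebysheff's bound for k = 2.

# Arbitrary example pmf, for illustration only.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

mu = sum(x * p for x, p in pmf.items())        # E(X)
ex2 = sum(x * x * p for x, p in pmf.items())   # E(X^2)
var = ex2 - mu ** 2                            # Theorem 4.3
sigma = var ** 0.5

# Tchebysheff: P(|X - mu| < k*sigma) >= 1 - 1/k^2.
k = 2
p_within = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
print(mu, var)                        # mu = 1.7, var = 0.81 (up to rounding)
print(p_within, ">=", 1 - 1 / k**2)   # the bound should hold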

Part 6. Bernoulli, Binomial and Geometric Distributions

Bernoulli Distribution
Let the probability of success be p; then the probability of failure is 1 − p, and the distribution of X is given by
p(x) = p^x (1 − p)^(1−x),   x = 0 or 1,
where p(x) denotes the probability that X = x.

E(X) = Σ x·p(x) = 0·p(0) + 1·p(1) = 0·(1 − p) + p = p, so E(X) = p.
V(X) = E(X^2) − [E(X)]^2 = Σ x^2·p(x) − p^2 = 0·(1 − p) + 1·p − p^2 = p − p^2 = p(1 − p), so V(X) = p(1 − p).

Binomial Distribution
Suppose we conduct n independent Bernoulli trials, each with probability p of success. Let the random variable X be the number of successes in these n trials. The distribution of X is called the binomial distribution. Let Yi = 1 if the ith trial is a success

and Yi = 0 if the ith trial is a failure. Then X = Σ_{i=1}^{n} Yi is the number of successes in the n independent trials, so X can take values in {0, 1, 2, 3, ..., n}. For example, when n = 3 and the probability of success is p, what is the probability function of X?

Cont.
The mass function of the binomial distribution:
p(x) = P(X = x) = C(n, x) p^x (1 − p)^(n−x),   x = 0, 1, ..., n.
From the binomial formula, we can verify
Σ_{x=0}^{n} C(n, x) p^x (1 − p)^(n−x) = (p + (1 − p))^n = 1.

A random variable X has a binomial distribution if:
1. The experiment consists of a fixed number n of identical trials.
2. Each trial has only two possible outcomes, i.e., the trials are Bernoulli trials.
3. The probability of success p is constant from trial to trial.
4. The trials are independent.
5. X is the number of successes in the n trials.
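A minimal Python sketch (added here) evaluates the binomial mass function with math.comb for an arbitrary choice of n and p and confirms the probabilities sum to 1.

from math import comb

n, p = 10, 0.3   # arbitrary example values

def binom_pmf(x):
    """P(X = x) = C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

probs = [binom_pmf(x) for x in range(n + 1)]
print(sum(probs))                               # ~1.0
print(binom_pmf(3))                             # P(X = 3), about 0.2668
print(sum(x * q for x, q in enumerate(probs)))  # approximately n*p = 3.0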

E(X) = np: writing X as the sum of the independent Bernoulli random variables Y1, Y2, ..., Yn, we get E(X) = Σ E(Yi) = np.
V(X) = np(1 − p): by the independence of the Bernoulli random variables Y1, Y2, ..., Yn, V(X) = Σ V(Yi) = np(1 − p).

Geometric Distribution: Probability Function
The geometric probability function: P(X = x) = p(x) = (1 − p)^x p = q^x p,   x = 0, 1, 2, ...,   q = 1 − p,
where X counts the number of failures before the first success.

P(X = x) = q^x p = q·[q^(x−1) p] = q·P(X = x − 1),
so each successive probability is obtained from the previous one by multiplying by q.

Sum of a partial geometric series: Σ_{i=0}^{x} q^i = (1 − q^(x+1))/(1 − q).
Then we can verify that the probabilities sum to 1. The cumulative distribution function:
F(x) = P(X ≤ x) = Σ_{i=0}^{x} q^i p = p·(1 − q^(x+1))/(1 − q) = 1 − q^(x+1),
and P(X ≥ x) = 1 − F(x − 1) = q^x.

Mean and Variance
The expected value: E(X) = q/p. The variance: V(X) = q/p^2.

Derivation of E(X): E(X) = Σ_{x=0}^{∞} x q^x p. So
E(X)/(pq) = Σ_{x=1}^{∞} x q^(x−1) = 1 + 2q + 3q^2 + ...,
and E(X)/p = Σ_{x=0}^{∞} x q^x = 0 + q + 2q^2 + 3q^3 + ....
Thus E(X)/(pq) − E(X)/p = 1 + q + q^2 + q^3 + ... = 1/(1 − q) = 1/p.
Since E(X)/(pq) − E(X)/p = E(X)·(1 − q)/(pq) = E(X)/q, this gives E(X)/q = 1/p, i.e., E(X) = q/p.
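As a quick check (added, not in the slides), the Python sketch below simulates geometric random variables, counting failures before the first success with an arbitrary p, and compares the sample mean with q/p.

import random

p = 0.25   # arbitrary success probability
q = 1 - p

def geometric_failures():
    """Number of failures before the first success in Bernoulli(p) trials."""
    x = 0
    while random.random() >= p:   # a trial fails with probability q
        x += 1
    return x

n = 100_000
sample_mean = sum(geometric_failures() for _ in range(n)) / n
print(sample_mean, "vs theoretical q/p =", q / p)  # both near 3.0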

Part 7. Negative Binomial, Poisson, Hypergeometric Distributions and MGF

Negative Binomial Distribution
What if we were interested in the number of failures prior to the second success, the third success, or (in general) the r-th success? Let X denote the number of failures prior to the r-th success, and let p denote the common probability of success. The negative binomial probability function:
P(X = x) = p(x) = C(x + r − 1, r − 1) q^x p^r,   x = 0, 1, 2, ...,   q = 1 − p.
If r = 1, the negative binomial distribution becomes the geometric distribution.
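The sketch below (added) evaluates this mass function with math.comb and confirms that setting r = 1 reproduces the geometric probabilities q^x p; the values of p and x are arbitrary.

from math import comb

def neg_binom_pmf(x, r, p):
    """P(X = x) = C(x + r - 1, r - 1) q^x p^r, failures before the r-th success."""
    q = 1 - p
    return comb(x + r - 1, r - 1) * q**x * p**r

p = 0.4
for x in range(5):
    # With r = 1 this should equal the geometric pmf q^x p.
    print(neg_binom_pmf(x, 1, p), (1 - p)**x * p)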

In summary, for the negative binomial distribution, E(X) = rq/p and V(X) = rq/p^2.

Poisson Distribution
The Poisson probability function: P(X = x) = p(x) = (λ^x / x!) e^(−λ),   x = 0, 1, 2, ...,   for λ > 0.
The distribution function is F(x) = P(X ≤ x) = Σ_{i=0}^{x} (λ^i / i!) e^(−λ).
Recall that λ denotes the mean number of occurrences in one time period; if there are t non-overlapping time periods, then the mean would be λt.

The Poisson distribution is often referred to as the distribution of rare events. For a Poisson random variable, E(X) = V(X) = λ.
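Here is a small added sketch that evaluates Poisson probabilities for an arbitrary λ and checks numerically that the mean and variance are both close to λ.

from math import exp, factorial

lam = 2.5   # arbitrary rate parameter

def poisson_pmf(x):
    """P(X = x) = (lam^x / x!) * e^(-lam)."""
    return lam**x / factorial(x) * exp(-lam)

# Truncate the support; probabilities beyond x = 50 are negligible for lam = 2.5.
xs = range(51)
mean = sum(x * poisson_pmf(x) for x in xs)
var = sum(x**2 * poisson_pmf(x) for x in xs) - mean**2
print(mean, var)   # both approximately 2.5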

Hypergeometric Distribution
Now we consider a general case: Suppose a lot consists of N items, of which k are of one type (called successes) and N − k are of another type (called failures). Now n items are sampled randomly and sequentially without replacement. Let X denote the number of successes among the n sampled items. What is P(X = x) for an integer x? The probability function is
P(X = x) = p(x) = C(k, x)·C(N − k, n − x) / C(N, n),
which is called the hypergeometric probability distribution.
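A short added example: with made-up values N = 20, k = 7, n = 5, the sketch below computes hypergeometric probabilities with math.comb and checks that they sum to 1.

from math import comb

N, k, n = 20, 7, 5   # made-up lot size, number of successes, sample size

def hypergeom_pmf(x):
    """P(X = x) = C(k, x) C(N-k, n-x) / C(N, n)."""
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

support = range(max(0, n - (N - k)), min(n, k) + 1)
probs = [hypergeom_pmf(x) for x in support]
print(sum(probs))        # 1.0 (up to rounding)
print(hypergeom_pmf(2))  # P(exactly 2 successes in the sample)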

Moment Generating Function
The k-th moment is defined as E(X^k) = Σ_x x^k p(x). For example, E(X) is the 1st moment and E(X^2) is the 2nd moment.
The moment generating function is defined as M(t) = E(e^(tX)), and we have M^(k)(0) = E(X^k). For example, M'(t) = E(X e^(tX)), so setting t = 0 gives M'(0) = E(X).
It is often easier to evaluate M(t) and its derivatives than to find the moments of the random variable directly.
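As an added illustration, the sketch below approximates moments from the MGF numerically for a Bernoulli(p) random variable, whose MGF is M(t) = (1 − p) + p e^t, and compares M'(0) and M''(0) with E(X) = p and E(X^2) = p.

from math import exp

p = 0.3   # arbitrary Bernoulli success probability

def M(t):
    """MGF of a Bernoulli(p) random variable: E(e^(tX)) = (1 - p) + p * e^t."""
    return (1 - p) + p * exp(t)

# Approximate derivatives of M at t = 0 by central finite differences.
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)           # ~ M'(0)  = E(X)   = p
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0) = E(X^2) = p
print(m1, m2)   # both approximately 0.3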
