1. Sample Space and Events: Rolling two fair six-sided dice. A = sum is 7, B = first die is 4.

(a) The sample space S consists of all ordered pairs (i, j), where i is the result of the first die and j is the result of the second die: S = {(i, j) : i, j ∈ {1, 2, 3, 4, 5, 6}}. The total number of outcomes is |S| = 6 × 6 = 36.

(b) Event A (sum is 7): The outcomes are pairs (i, j) such that i + j = 7. A = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}. The number of outcomes in A is |A| = 6. Since the dice are fair, all outcomes in S are equally likely. P(A) = 6/36 = 1/6.

(c) Event B (first die shows a 4): The outcomes are pairs (4, j) such that j ∈ {1, …, 6}. B = {(4,1), (4,2), (4,3), (4,4), (4,5), (4,6)}. The number of outcomes in B is |B| = 6. P(B) = 6/36 = 1/6.

(d) A ∩ B is the event where the sum is 7 AND the first die is 4. The only outcome satisfying both conditions is (4, 3). A ∩ B = {(4, 3)}. P(A ∩ B) = 1/36. P(A ∪ B) is the probability that the sum is 7 OR the first die is 4. We use the formula P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 1/6 + 1/6 − 1/36 = 11/36.
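All four probabilities above can be checked by brute-force enumeration of the 36 equally likely outcomes; a minimal sketch in Python:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two fair dice.
space = set(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a subset of outcomes) under the uniform measure."""
    return Fraction(len(event), len(space))

A = {(i, j) for (i, j) in space if i + j == 7}   # sum is 7
B = {(i, j) for (i, j) in space if i == 4}       # first die shows 4

p_a, p_b = prob(A), prob(B)          # 1/6 each
p_and = prob(A & B)                  # 1/36
p_or = prob(A | B)                   # 11/36
```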


2. Axioms and Conditional Probability: Box: 5 Red (R), 3 Blue (B). Total = 8. Two balls drawn without replacement.

(a) Let R1 be the event the first ball is red, and B2 be the event the second ball is blue. We want P(B2 | R1). If the first ball drawn was red, there are 7 balls remaining: 4 Red and 3 Blue. The probability that the second ball drawn is blue is P(B2 | R1) = 3/7.

(b) Let R1 be the event the first ball is red, and R2 be the event the second ball is red. We want P(R1 ∩ R2). Using the multiplication rule for conditional probability: P(R1 ∩ R2) = P(R1) · P(R2 | R1). P(R1) = 5/8. Given the first was red, there are 4 red balls left out of 7 total. So, P(R2 | R1) = 4/7. P(R1 ∩ R2) = (5/8)(4/7) = 20/56 = 5/14.

(c) Let R1 be the event “first ball red” and B1 be the event “first ball blue”. This refers to the sample space of the first draw. Let this sample space be S1. P(R1) = 5/8 and P(B1) = 3/8.

  • Axiom 1 (Non-negativity): P(R1) = 5/8 ≥ 0 and P(B1) = 3/8 ≥ 0. Satisfied.
  • Axiom 2 (Normalization): The sample space for the first draw is S1 = R1 ∪ B1. P(S1) = P(R1) + P(B1) = 5/8 + 3/8 = 1 (since R1 and B1 are mutually exclusive). Satisfied.
  • Axiom 3 (Additivity): For mutually exclusive events (like R1 and B1 in S1), the probability of their union is the sum of their probabilities: P(R1 ∪ B1) = P(R1) + P(B1). This was used in verifying Axiom 2. Satisfied.
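The conditional probabilities in (a) and (b) can be verified by enumerating all 8 × 7 = 56 ordered two-ball draws; a sketch:

```python
from fractions import Fraction
from itertools import permutations

# 5 Red, 3 Blue; enumerate all ordered draws of two distinct balls.
balls = ['R'] * 5 + ['B'] * 3
draws = list(permutations(range(len(balls)), 2))  # 56 ordered pairs of indices

def prob(pred):
    """Probability that an ordered draw (first, second) satisfies pred."""
    hits = sum(1 for (i, j) in draws if pred(balls[i], balls[j]))
    return Fraction(hits, len(draws))

p_r1 = prob(lambda a, b: a == 'R')                         # 5/8
p_r1_r2 = prob(lambda a, b: a == 'R' and b == 'R')         # 5/14
p_b2_given_r1 = prob(lambda a, b: a == 'R' and b == 'B') / p_r1   # 3/7
```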

3. Law of Total Probability: Urn 1 (U1): 2W, 3B (Total 5). Urn 2 (U2): 4W, 1B (Total 5). An urn is chosen at random, P(U1) = P(U2) = 1/2. Let W be the event that the drawn ball is white. We need P(W). Using the Law of Total Probability: P(W) = P(W | U1)P(U1) + P(W | U2)P(U2). From Urn 1, P(W | U1) = 2/5. From Urn 2, P(W | U2) = 4/5. P(W) = (2/5)(1/2) + (4/5)(1/2) = 1/5 + 2/5 = 3/5.


4. Bayes’ Theorem: Using the setup from Problem 3. A white ball (W) was drawn. What is the probability it came from Urn 1 (U1)? We need P(U1 | W). Using Bayes’ Theorem: P(U1 | W) = P(W | U1)P(U1) / P(W). From Problem 3, we have: P(W | U1) = 2/5, P(U1) = 1/2, and P(W) = 3/5. P(U1 | W) = (2/5)(1/2) / (3/5) = (1/5)/(3/5) = 1/3.
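Problems 3 and 4 can be checked together with exact fractions; a sketch:

```python
from fractions import Fraction

p_u1 = p_u2 = Fraction(1, 2)      # urn chosen at random
p_w_u1 = Fraction(2, 5)           # white from Urn 1 (2W, 3B)
p_w_u2 = Fraction(4, 5)           # white from Urn 2 (4W, 1B)

# Law of Total Probability
p_w = p_w_u1 * p_u1 + p_w_u2 * p_u2      # 3/5

# Bayes' Theorem
p_u1_w = p_w_u1 * p_u1 / p_w             # 1/3
```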


5. Independent Events: Given P(A), P(B), and P(A ∪ B). Events A and B are independent if P(A ∩ B) = P(A)P(B). First, find P(A ∩ B) using the inclusion-exclusion principle: P(A ∩ B) = P(A) + P(B) − P(A ∪ B). Now check the condition for independence by comparing this value with the product P(A)P(B). Since P(A ∩ B) is equal to P(A)P(B), the events A and B are independent.
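The check can be packaged as a small function. The numbers below are hypothetical illustrations (the original problem's values are not recoverable from this text), chosen so that one case is independent and one is not:

```python
from fractions import Fraction

def independent(p_a, p_b, p_a_or_b):
    """Independence test via inclusion-exclusion: P(A∩B) = P(A)+P(B)-P(A∪B)."""
    p_a_and_b = p_a + p_b - p_a_or_b
    return p_a_and_b == p_a * p_b

# Hypothetical values: P(A)=1/2, P(B)=2/5, P(A∪B)=7/10 => P(A∩B)=1/5 = P(A)P(B).
case_indep = independent(Fraction(1, 2), Fraction(2, 5), Fraction(7, 10))   # True
# Hypothetical values: P(A∪B)=3/5 => P(A∩B)=3/10 != 1/5.
case_dep = independent(Fraction(1, 2), Fraction(2, 5), Fraction(3, 5))      # False
```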


6. Binomial Distribution: Coin flipped n times, with P(heads) = p on each flip. X = number of heads. X ~ Binomial(n, p).

(a) Probability of getting exactly k heads: P(X = k) = C(n, k) · p^k · (1 − p)^(n−k).

(b) Expected number of heads: E[X] = np.

(c) Variance of the number of heads: Var(X) = np(1 − p).
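The PMF, mean, and variance formulas can be verified by direct summation. The parameters n = 10, p = 0.5 below are hypothetical illustrations, not the original problem's values:

```python
import math

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.5   # hypothetical values for illustration
mean = sum(k * binom_pmf(k, n, p) for k in range(n + 1))               # np = 5.0
var = sum((k - mean) ** 2 * binom_pmf(k, n, p) for k in range(n + 1))  # np(1-p) = 2.5
```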


7. Geometric Distribution: Free throws with success probability p. X = number of attempts needed for the first success. X ~ Geometric(p), so P(X = k) = (1 − p)^(k−1) · p for k = 1, 2, …

(a) Probability that the first success occurs on the 3rd attempt: P(X = 3) = (1 − p)^2 · p.

(b) Expected number of attempts needed: E[X] = 1/p.
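A quick numeric check of both parts, using a hypothetical success probability p = 0.6 (the original value is not recoverable from this text):

```python
p = 0.6   # hypothetical free-throw success probability

def geom_pmf(k, p):
    """P(X = k): first success on attempt k (k = 1, 2, ...)."""
    return (1 - p) ** (k - 1) * p

p3 = geom_pmf(3, p)   # (0.4)^2 * 0.6 = 0.096

# E[X] = 1/p, checked by truncating the series sum over k of k * P(X = k).
approx_mean = sum(k * geom_pmf(k, p) for k in range(1, 200))
```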


8. Poisson Distribution: Average rate λ customers per hour. Let X be the number of arrivals in a given hour. X ~ Poisson(λ).

(a) Probability that exactly k customers arrive: P(X = k) = e^(−λ) · λ^k / k!.

(b) Probability that at least one customer arrives: P(X ≥ 1) = 1 − P(X = 0). P(X = 0) = e^(−λ) · λ^0 / 0! = e^(−λ). P(X ≥ 1) = 1 − e^(−λ).

(c) Mean and variance of the number of arrivals per hour: For a Poisson distribution, the mean and variance are both equal to the rate parameter λ. E[X] = λ. Var(X) = λ.
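All three parts can be checked numerically. The rate λ = 3 below is a hypothetical illustration, not the original problem's value:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 3   # hypothetical arrival rate (customers per hour)
p_at_least_one = 1 - poisson_pmf(0, lam)                   # 1 - e^(-3) ≈ 0.9502
mean = sum(k * poisson_pmf(k, lam) for k in range(100))    # ≈ lam
```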


9. Uniform Distribution: X is uniformly distributed on [a, b].

(a) Probability Density Function (PDF): f(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

(b) Cumulative Distribution Function (CDF): F(x) = 0 for x < a; F(x) = (x − a)/(b − a) for a ≤ x ≤ b; F(x) = 1 for x > b.

(c) Calculate P(c ≤ X ≤ d) for a subinterval a ≤ c ≤ d ≤ b: Using the CDF: P(c ≤ X ≤ d) = F(d) − F(c) = (d − c)/(b − a). Alternatively, using the PDF: ∫ from c to d of 1/(b − a) dx = (d − c)/(b − a).

(d) Calculate E[X] and Var(X): E[X] = (a + b)/2. Var(X) = (b − a)²/12.
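A numeric check of the CDF, mean, and variance formulas, using a hypothetical interval [2, 8] (the original endpoints are not recoverable from this text):

```python
a, b = 2.0, 8.0   # hypothetical interval for illustration

def unif_cdf(x):
    """CDF of Uniform(a, b)."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

p_3_6 = unif_cdf(6) - unif_cdf(3)   # (6 - 3) / (8 - 2) = 0.5
mean = (a + b) / 2                  # 5.0
var = (b - a) ** 2 / 12             # 3.0
```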


10. Exponential Distribution: Lifetime T follows an exponential distribution with parameter λ. T ~ Exp(λ).

(a) Probability Density Function (PDF): f(t) = λe^(−λt) for t ≥ 0, and 0 otherwise.

(b) Probability that the component lasts more than 5 years, P(T > 5): (Survival function) P(T > t) = e^(−λt), so P(T > 5) = e^(−5λ).

(c) Expected lifetime E[T]: E[T] = 1/λ years.

(d) Using the memoryless property: The memoryless property states P(T > s + t | T > s) = P(T > t). Having already survived s years gives no information about the remaining lifetime, so the conditional probability reduces to the unconditional survival probability e^(−λt).
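The survival function and the memoryless property can be checked directly. The rate λ = 0.2 below is a hypothetical illustration (giving a mean lifetime of 5 years), not necessarily the original problem's value:

```python
import math

lam = 0.2   # hypothetical rate per year => mean lifetime 1/lam = 5 years

def survival(t):
    """P(T > t) for T ~ Exp(lam)."""
    return math.exp(-lam * t)

p_more_than_5 = survival(5)   # e^(-1) ≈ 0.3679
mean_lifetime = 1 / lam       # 5.0 years

# Memoryless property: P(T > s + t | T > s) == P(T > t)
s, t = 3, 5
lhs = survival(s + t) / survival(s)
rhs = survival(t)
```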


11. Normal Distribution: X ~ N(μ, σ²). This means the standard deviation is σ = √(σ²).

(a) Standardize the random variable X: Z = (X − μ)/σ. Z follows the standard normal distribution, Z ~ N(0, 1).

(b) Express P(a ≤ X ≤ b) in terms of the standard normal CDF Φ: First, standardize the bounds: Lower bound: z1 = (a − μ)/σ. Upper bound: z2 = (b − μ)/σ. P(a ≤ X ≤ b) = P(z1 ≤ Z ≤ z2). Using the CDF Φ: P(a ≤ X ≤ b) = Φ(z2) − Φ(z1).

(c) Calculate E[aX + b] and Var(aX + b) for constants a and b: Using properties of expectation and variance: E[aX + b] = aμ + b. Var(aX + b) = a²σ².
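The standardization in (b) can be checked with a Φ built from the error function. The parameters μ = 10, σ = 2 and bounds [8, 12] are hypothetical illustrations, not the original problem's values:

```python
import math

def phi(z):
    """Standard normal CDF, via the error function: Phi(z) = (1 + erf(z/sqrt(2)))/2."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 10.0, 2.0   # hypothetical parameters: X ~ N(10, 4)
a, b = 8.0, 12.0        # hypothetical bounds, one sigma on each side

z1, z2 = (a - mu) / sigma, (b - mu) / sigma   # -1.0, 1.0
p = phi(z2) - phi(z1)                         # ≈ 0.6827 (within one sigma)
```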


12. Expected Value and Variance: Discrete random variable X with PMF p(x) on a finite set of values.

(a) Calculate E[X]: E[X] = Σ over x of x · p(x).

(b) Calculate E[X²]: E[X²] = Σ over x of x² · p(x).

(c) Calculate Var(X) using E[X²]: Var(X) = E[X²] − (E[X])².

(d) Calculate E[aX + b] and Var(aX + b) for constants a and b: E[aX + b] = aE[X] + b. Var(aX + b) = a²Var(X).
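These moment computations are direct to code. The PMF below is a hypothetical illustration (the original table is not recoverable from this text):

```python
# Hypothetical PMF: P(X=0) = 0.2, P(X=1) = 0.5, P(X=2) = 0.3.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

e_x = sum(x * p for x, p in pmf.items())        # E[X]  = 1.1
e_x2 = sum(x * x * p for x, p in pmf.items())   # E[X^2] = 1.7
var_x = e_x2 - e_x ** 2                         # 1.7 - 1.21 = 0.49

# Linear transformation a*X + b:
a, b = 2, 3
e_lin = a * e_x + b          # 5.2
var_lin = a ** 2 * var_x     # 1.96
```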


13. Bivariate Discrete Distribution: Joint PMF given in the table.

(a) Find the marginal PMFs p_X(x) and p_Y(y): The marginal PMF of X is found by summing the joint probabilities over y for each x. p_X(0) = 0.1 + 0.2 = 0.3. p_X(1) = 0.3 + 0.4 = 0.7. Check: 0.3 + 0.7 = 1. The marginal PMF of Y is found by summing the joint probabilities over x for each y. p_Y(0) = 0.1 + 0.3 = 0.4. p_Y(1) = 0.2 + 0.4 = 0.6. Check: 0.4 + 0.6 = 1. Completed table:

          y = 0   y = 1   p_X(x)
x = 0     0.1     0.2     0.3
x = 1     0.3     0.4     0.7
p_Y(y)    0.4     0.6     1

(b) Find the conditional PMF p_{Y|X}(y | x = 0): p_{Y|X}(y | x) = p(x, y)/p_X(x). We need p(0, 0) = 0.1, p(0, 1) = 0.2, and p_X(0) = 0.3. p_{Y|X}(0 | 0) = 0.1/0.3 = 1/3. p_{Y|X}(1 | 0) = 0.2/0.3 = 2/3. The conditional PMF for Y given X = 0 is p(0 | 0) = 1/3 and p(1 | 0) = 2/3.

(c) Calculate E[XY]: E[XY] = Σ over (x, y) of x · y · p(x, y) = 1·1·0.4 = 0.4 (every other term has x = 0 or y = 0).

(d) Calculate Cov(X, Y): Cov(X, Y) = E[XY] − E[X]E[Y]. First find E[X] and E[Y] using the marginals: E[X] = 0·0.3 + 1·0.7 = 0.7. E[Y] = 0·0.4 + 1·0.6 = 0.6. Cov(X, Y) = 0.4 − (0.7)(0.6) = 0.4 − 0.42 = −0.02.

(e) Are X and Y independent? Check if p(x, y) = p_X(x)·p_Y(y) for all (x, y). Let’s check (0, 0): p(0, 0) = 0.1. p_X(0)·p_Y(0) = 0.3 × 0.4 = 0.12. Since 0.1 ≠ 0.12, X and Y are not independent.
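All of parts (a)–(e) follow mechanically from the joint table; a sketch:

```python
# Joint PMF from the table: p[(x, y)].
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

px = {x: sum(v for (a, b), v in p.items() if a == x) for x in (0, 1)}  # marginal of X
py = {y: sum(v for (a, b), v in p.items() if b == y) for y in (0, 1)}  # marginal of Y

e_xy = sum(x * y * v for (x, y), v in p.items())   # 0.4
e_x = sum(x * v for x, v in px.items())            # 0.7
e_y = sum(y * v for y, v in py.items())            # 0.6
cov = e_xy - e_x * e_y                             # -0.02

# Independence would require p(x, y) == px(x) * py(y) for every cell.
is_indep = all(abs(p[(x, y)] - px[x] * py[y]) < 1e-12
               for x in (0, 1) for y in (0, 1))    # False
```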


14. Bivariate Continuous Distribution: f(x, y) = c·g(x, y) for (x, y) in the support region, and 0 otherwise, where g is the given (unnormalized) density shape and c is a constant to be determined.

(a) Find the value of c: The total integral of the PDF must be 1: ∫∫ f(x, y) dy dx = 1. Solving this equation determines c and fully specifies the joint PDF on the support region.

(b) Find the marginal PDF f_X(x): Integrate the joint PDF over y. For x in the support of X: f_X(x) = ∫ f(x, y) dy. So f_X(x) is given by this integral on the support of X, and 0 otherwise.

(c) Find the conditional PDF f_{Y|X}(y | x): f_{Y|X}(y | x) = f(x, y)/f_X(x) wherever f_X(x) > 0. The conditional PDF is defined for y in the section of the support at the given x, and 0 otherwise.

(d) Are X and Y independent? Check if f(x, y) = f_X(x)·f_Y(y). By symmetry of the joint PDF, the marginal PDF for Y has the same form as f_X. Since the product f_X(x)·f_Y(y) does not equal the joint PDF f(x, y) in general, X and Y are not independent.
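The workflow above can be sanity-checked numerically. The density f(x, y) = x + y on the unit square is a hypothetical stand-in (the original problem's density is not recoverable from this text); for this choice the constant c works out to 1 and the variables are not independent:

```python
# Hypothetical density for illustration: f(x, y) = x + y on [0, 1]^2 (c = 1).
N = 200
h = 1.0 / N
pts = [(i + 0.5) * h for i in range(N)]   # midpoints of an N x N grid

def f(x, y):
    return x + y

# (a) Normalization: the double integral should be 1 (midpoint rule).
total = sum(f(x, y) for x in pts for y in pts) * h * h

# (b) Marginal of X: integrate out y; analytically f_X(x) = x + 1/2.
def f_X(x):
    return sum(f(x, y) for y in pts) * h

# (d) Independence fails: the joint does not factor (f_Y = f_X by symmetry).
gap = abs(f(0.1, 0.9) - f_X(0.1) * f_X(0.9))   # |1.0 - 0.6 * 1.4| = 0.16
```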


15. Independence Check: Joint PDF f(x, y) = e^(−(x+y)) for x > 0, y > 0.

Find the marginal PDFs: For x > 0: f_X(x) = ∫ from 0 to ∞ of e^(−(x+y)) dy = e^(−x). So, f_X(x) = e^(−x) for x > 0 (X follows Exp(1)). For y > 0: f_Y(y) = ∫ from 0 to ∞ of e^(−(x+y)) dx = e^(−y). So, f_Y(y) = e^(−y) for y > 0 (Y follows Exp(1)).

Check independence: f_X(x)·f_Y(y) = e^(−x)·e^(−y) = e^(−(x+y)). This is equal to the joint PDF f(x, y). Also, the support region {(x, y) : x > 0, y > 0} is a product space. Therefore, X and Y are independent.
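The factorization is exact, which a grid check confirms numerically:

```python
import math

f = lambda x, y: math.exp(-(x + y))   # joint PDF on x > 0, y > 0
f_x = lambda x: math.exp(-x)          # marginal of X: Exp(1)
f_y = lambda y: math.exp(-y)          # marginal of Y: Exp(1)

# The joint PDF factors into the product of the marginals everywhere on the grid.
grid = [0.1 * k for k in range(1, 50)]
factors = all(abs(f(x, y) - f_x(x) * f_y(y)) < 1e-12
              for x in grid for y in grid)
```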


16. Covariance and Correlation: Given E[X], E[Y], E[XY], Var(X), and Var(Y).

Calculate Covariance: Cov(X, Y) = E[XY] − E[X]·E[Y].

Calculate Correlation Coefficient ρ: ρ = Cov(X, Y)/(σ_X · σ_Y). We need the standard deviations: σ_X = √Var(X). σ_Y = √Var(Y). Substituting these values gives ρ.

Are X and Y uncorrelated? Two variables are uncorrelated if their covariance is 0. Since Cov(X, Y) ≠ 0, X and Y are not uncorrelated (they are correlated).
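The two formulas are one line each in code. The moments below are hypothetical illustrations (the original values are not recoverable from this text):

```python
import math

# Hypothetical moments: E[X]=2, E[Y]=3, E[XY]=7, Var(X)=4, Var(Y)=9.
e_x, e_y, e_xy = 2.0, 3.0, 7.0
var_x, var_y = 4.0, 9.0

cov = e_xy - e_x * e_y                              # 7 - 6 = 1
rho = cov / (math.sqrt(var_x) * math.sqrt(var_y))   # 1 / (2 * 3) ≈ 0.1667
```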


17. Conditional Expectation: Using the joint PMF from Problem 13. Calculate E[Y | X = 1]. E[Y | X = 1] = Σ over y of y · p_{Y|X}(y | 1). From Problem 13a, p_X(1) = 0.7. p_{Y|X}(0 | 1) = 0.3/0.7 = 3/7 and p_{Y|X}(1 | 1) = 0.4/0.7 = 4/7. E[Y | X = 1] = 0·(3/7) + 1·(4/7) = 4/7 ≈ 0.571.
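With exact fractions, the conditional expectation from the Problem 13 table is:

```python
from fractions import Fraction

# Joint PMF from Problem 13, as exact fractions.
p = {(0, 0): Fraction(1, 10), (0, 1): Fraction(1, 5),
     (1, 0): Fraction(3, 10), (1, 1): Fraction(2, 5)}

p_x1 = p[(1, 0)] + p[(1, 1)]                               # p_X(1) = 7/10
e_y_given_x1 = sum(y * p[(1, y)] for y in (0, 1)) / p_x1   # (2/5) / (7/10) = 4/7
```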


18. Law of Total Expectation: Given E[X | Y] = aY + b (a linear function of Y) and the value of E[Y]. Calculate E[X]. By the Law of Total Expectation (also known as the Tower Property): E[X] = E[E[X | Y]]. Substitute the given conditional expectation: E[X] = E[aY + b]. Using linearity of expectation: E[X] = aE[Y] + b. Substituting the value of E[Y] yields E[X].
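The tower property can be verified on a toy setup. The choices E[X | Y] = 2Y + 1 and Y uniform on {1, 2, 3} below are hypothetical illustrations, not the original problem's values:

```python
from fractions import Fraction

# Hypothetical setup: E[X | Y] = 2Y + 1 and Y uniform on {1, 2, 3}, so E[Y] = 2.
p_y = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

def cond_exp_x(y):
    """E[X | Y = y] = 2y + 1 (hypothetical linear conditional expectation)."""
    return 2 * y + 1

e_y = sum(y * p for y, p in p_y.items())               # 2
e_x = sum(cond_exp_x(y) * p for y, p in p_y.items())   # E[E[X|Y]] = 2*2 + 1 = 5
```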


19. Multivariate Normal Distribution: (X1, X2) ~ N(μ, Σ) with mean vector μ and covariance matrix Σ.

(a) What are E[X1], E[X2], Var(X1), Var(X2), and Cov(X1, X2)? These are read directly from the mean vector μ and covariance matrix Σ. E[X1] = μ1. E[X2] = μ2. Var(X1) = Σ11. Var(X2) = Σ22. Cov(X1, X2) = Σ12 = Σ21.

(b) Are X1 and X2 independent? Why or why not? For jointly normally distributed variables, independence is equivalent to having zero covariance. Since Cov(X1, X2) ≠ 0, X1 and X2 are not independent.

(c) What is the distribution of the linear combination Y = aX1 + bX2? A linear combination of jointly normal random variables is also normally distributed. We need to find its mean and variance. Mean: E[Y] = aE[X1] + bE[X2]. Variance: Using the formula Var(aX1 + bX2) = a²Var(X1) + b²Var(X2) + 2ab·Cov(X1, X2) with the given coefficients and the entries of Σ. Therefore, Y follows a normal distribution with mean 5 and variance 44. Y ~ N(5, 44).
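As a concrete check of the mean/variance formulas, here is one hypothetical parameter choice consistent with the stated answer N(5, 44) (the original μ, Σ, and coefficients are not recoverable from this text): μ = (1, 2), Σ = [[4, 1], [1, 9]], Y = X1 + 2·X2.

```python
# Hypothetical parameters (illustration only), chosen to reproduce N(5, 44).
mu = (1, 2)
Sigma = [[4, 1], [1, 9]]
a, b = 1, 2   # Y = 1*X1 + 2*X2

mean_Y = a * mu[0] + b * mu[1]                  # 1*1 + 2*2 = 5
var_Y = (a**2 * Sigma[0][0] + b**2 * Sigma[1][1]
         + 2 * a * b * Sigma[0][1])             # 4 + 36 + 4 = 44
```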


20. Multinomial Distribution: Rolling a fair six-sided die n times. X1 = number of times ‘1’ appears. X2 = number of times ‘2’ or ‘3’ appears. X3 = number of times ‘4’, ‘5’, or ‘6’ appears.

(a) What are the parameters n and k, and the probabilities p1, p2, p3? n = the number of rolls (trials). k = 3 (number of categories). The probabilities for each category on a single roll are: p1 = 1/6. p2 = 2/6 = 1/3. p3 = 3/6 = 1/2. Check: 1/6 + 1/3 + 1/2 = 1.

(b) Calculate the probability of observing the outcome . Check that . The multinomial probability formula is: . . Calculate the multinomial coefficient: . Calculate the probability part: . . Simplifying the fraction (dividing numerator and denominator by 72): . (Approximate value: )