Random Variables and Their Applications
What is a Random Variable?
A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. There are two main types of random variables:
- Discrete Random Variables: These can take on a countable number of distinct values. Examples include the number of heads in a series of coin flips or the number of students in a class.
- Continuous Random Variables: These can take on any value within a given range. Examples include the height of students in a class or the time it takes to run a marathon.
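As a rough illustration of the two types above, here is a minimal Python sketch (standard library only; the coin-flip count and the marathon-time parameters are arbitrary values chosen just for this example):

```python
import random

random.seed(42)

# Discrete random variable: number of heads in 10 coin flips.
# It can only take the countable values 0, 1, ..., 10.
heads = sum(1 for _ in range(10) if random.random() < 0.5)
print("Number of heads in 10 flips:", heads)

# Continuous random variable: a marathon time drawn from a normal
# distribution (illustrative mean 240 minutes, std. dev. 30 minutes).
# It can take any real value in its range, e.g. 251.37 minutes.
marathon_time = random.gauss(240, 30)
print("Simulated marathon time (minutes):", round(marathon_time, 2))
```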
Uses of Random Variables in Daily Life
Random variables are used in various fields to model uncertainty and make predictions. Here are some examples:
- Finance: Modeling stock prices and returns on investments.
- Engineering: Assessing the reliability of systems and components.
- Medicine: Evaluating the effectiveness of treatments and drugs.
- Retail: Predicting the number of customers or sales in a given period.
Difference Between Discrete and Continuous Random Variables
The main difference between discrete and continuous random variables is the type of values they can take:
- Discrete Random Variables: Can take specific, countable values (e.g., 0, 1, 2, 3).
- Continuous Random Variables: Can take any value within a range (e.g., 1.5, 2.3, 3.7).
Discrete Random Variables
A discrete random variable is one that can take on a finite or countably infinite number of values. Examples include the number of defective items in a batch or the number of goals scored in a match.
Probability Mass Function (PMF)
The probability mass function (PMF) of a discrete random variable gives the probability that the variable takes on a specific value. The PMF must satisfy two conditions:
- All probabilities are non-negative.
- The sum of all probabilities is 1.
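A minimal sketch of checking these two conditions, assuming a small hand-written PMF (the same two-coin-flip probabilities used later in this post):

```python
# Hypothetical PMF for the number of heads in two fair coin flips.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

# Condition 1: all probabilities are non-negative.
assert all(p >= 0 for p in pmf.values())

# Condition 2: the probabilities sum to 1 (allowing for float rounding).
assert abs(sum(pmf.values()) - 1.0) < 1e-9

print("Valid PMF:", pmf)
```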
Skewed Distributions
[Figure: histograms of a symmetric, a positively skewed, and a negatively skewed distribution]
- Symmetric (no skew): mode = median = mean.
- Positively skewed: mode < median < mean.
- Negatively skewed: mode > median > mean.
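To illustrate these relationships, the sketch below draws a positively skewed sample (exponential waiting times; the rate and sample size are arbitrary choices for this example) and compares the three measures. The mode of a continuous sample is only approximated here by rounding values to one decimal place:

```python
import random
import statistics

random.seed(0)

# A positively (right-) skewed sample, e.g. exponential waiting times.
sample = [random.expovariate(1.0) for _ in range(100_000)]

mean = statistics.mean(sample)
median = statistics.median(sample)
# Approximate the mode by binning the continuous values.
mode = statistics.mode(round(x, 1) for x in sample)

# For positive skew we expect mode < median < mean.
print(f"mode ~ {mode:.1f}, median ~ {median:.2f}, mean ~ {mean:.2f}")
```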
Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF) of a random variable gives the probability that the variable takes on a value less than or equal to a specific value. It differs from the PMF in that, instead of giving the probability of one exact value, it accumulates the probabilities of all values up to and including that point.
Example of PMF and CDF
Consider a discrete random variable X that represents the number of heads in two coin flips. The PMF and CDF can be represented as follows:
| X | PMF, P(X = x) | CDF, P(X ≤ x) |
|---|---|---|
| 0 | 0.25 | 0.25 |
| 1 | 0.50 | 0.75 |
| 2 | 0.25 | 1.00 |
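As a quick check on the table, here is a short sketch that derives the CDF column from the PMF column by accumulation (the values are just the two-fair-coin-flip probabilities from the example above):

```python
from itertools import accumulate

# PMF of X = number of heads in two fair coin flips:
# the outcomes HH, HT, TH, TT are equally likely.
values = [0, 1, 2]
pmf = [0.25, 0.50, 0.25]

# The CDF accumulates the PMF probabilities up to each value.
cdf = list(accumulate(pmf))

for x, p, c in zip(values, pmf, cdf):
    print(f"X = {x}: P(X = {x}) = {p:.2f}, P(X <= {x}) = {c:.2f}")
```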