Contents

  • The Bernoulli Random Variable
  • The Binomial Random Variable
  • The Geometric Random Variable
  • The Poisson Random Variable
  • Functions of Random Variables
  • References

The most important way to characterize a random variable is through the probabilities of the values that it can take. For a discrete random variable $X$, these are captured by the probability mass function (PMF for short) of $X$, denoted $p_X$. In particular, for any real number $x$, the probability mass of $x$, denoted $p_X(x)$, is the probability of the event $\{X = x\}$. Thus, from the additivity and normalization axioms, we have

$$\sum_{x} p_X(x) = 1$$

In what follows, we will often omit the braces from the event/set notation when no ambiguity can arise. In particular, we will usually write $P(X = x)$ in place of the more correct notation $P(\{X = x\})$.

We will use upper case characters to denote random variables, and lower case characters to denote real numbers such as the numerical values of a random variable.
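As a quick illustrative sketch (our own example, not from the text), the PMF of the sum of two fair dice can be tabulated and the normalization property $\sum_x p_X(x) = 1$ checked directly:

```python
# Build the PMF of X = sum of two fair dice and verify the
# normalization axiom: the probability masses sum to 1.
from fractions import Fraction
from collections import Counter

counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

assert sum(pmf.values()) == 1   # normalization: sum_x p_X(x) = 1
print(pmf[7])                   # P(X = 7) = 1/6
```

Using `Fraction` keeps the masses exact, so the normalization check is an exact equality rather than a floating-point comparison.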

The Bernoulli Random Variable

  • Consider the toss of a coin, which comes up a head with probability $p$, and a tail with probability $1 - p$. The Bernoulli random variable takes the two values 1 and 0, depending on whether the outcome is a head or a tail:

    $$X = \begin{cases} 1, & \text{if a head,}\\ 0, & \text{if a tail.} \end{cases}$$

    Its PMF is

    $$p_X(k) = \begin{cases} p, & \text{if } k = 1,\\ 1 - p, & \text{if } k = 0. \end{cases}$$
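The Bernoulli PMF is simple enough to write as a small function; this is our own sketch (the function name is ours):

```python
# A minimal Bernoulli PMF: p_X(1) = p, p_X(0) = 1 - p, and 0 elsewhere.
def bernoulli_pmf(k, p):
    if k == 1:
        return p
    if k == 0:
        return 1 - p
    return 0.0

p = 0.25
assert bernoulli_pmf(1, p) + bernoulli_pmf(0, p) == 1.0  # masses sum to 1
```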

The Binomial Random Variable

  • A coin is tossed $n$ times. At each toss, the coin comes up a head with probability $p$, and a tail with probability $1 - p$, independent of prior tosses. Let $X$ be the number of heads in the $n$-toss sequence. We refer to $X$ as a binomial random variable with parameters $n$ and $p$. The PMF of $X$ consists of the binomial probabilities:

    $$p_X(k) = P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \ldots, n.$$

    The normalization property, specialized to the binomial random variable, is written as

    $$\sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} = 1$$

Form of the binomial PMF.

  • Let $k^* = \lfloor (n + 1)p \rfloor$. The PMF $p_X(k)$ is monotonically nondecreasing with $k$ in the range from $0$ to $k^*$, and is monotonically decreasing with $k$ for $k \geq k^*$. This follows from the ratio of consecutive terms:

    $$\frac{p_X(k)}{p_X(k-1)} = \frac{(n+1)p - kp}{k - kp}$$
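The mode formula $k^* = \lfloor (n+1)p \rfloor$ can be checked by brute force against the binomial PMF; the helper below is our own sketch:

```python
# Check that the binomial PMF peaks at k* = floor((n+1)p)
# by comparing against a direct argmax over all k.
from math import comb, floor

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3
k_star = floor((n + 1) * p)   # = 3
mode = max(range(n + 1), key=lambda k: binom_pmf(k, n, p))
assert mode == k_star
```

(When $(n+1)p$ is an integer, the PMF has a tie at $k^* - 1$ and $k^*$; the parameters above avoid that edge case.)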

Problem 6.

The Celtics and the Lakers are set to play a playoff series of $n$ basketball games, where $n$ is odd. The Celtics have a probability $p$ of winning any one game, independent of other games. For any $k > 0$, find the values of $p$ for which $n = 2k + 1$ is better for the Celtics than $n = 2k - 1$.

SOLUTION

  • Let $N$ be the number of Celtics' wins in the first $2k - 1$ games. If $A$ denotes the event that the Celtics win with $n = 2k + 1$, and $B$ denotes the event that the Celtics win with $n = 2k - 1$, then

    $$P(A) = P(N \geq k+1) + P(N = k)\cdot\bigl(1 - (1-p)^2\bigr) + P(N = k-1)\cdot p^2$$
    $$P(B) = P(N \geq k) = P(N = k) + P(N \geq k+1)$$

    and therefore

    $$\begin{aligned} P(A) - P(B) &= P(N = k-1)\cdot p^2 - P(N = k)\cdot (1-p)^2 \\ &= \frac{(2k-1)!}{(k-1)!\,k!}\, p^k (1-p)^k (2p - 1) \end{aligned}$$

    It follows that $P(A) > P(B)$ if and only if $p > \frac{1}{2}$. Thus, a longer series is better for the better team.
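As a numerical sanity check (code and names are our own sketch), the probability of winning a best-of-$n$ series can be computed exactly from the binomial PMF:

```python
# Exact series-win probability: win a majority of n independent games.
from math import comb

def win_prob(n, p):
    """Probability of winning at least n//2 + 1 of n games (n odd)."""
    need = n // 2 + 1
    return sum(comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(need, n + 1))

k = 3
# p > 1/2: the longer series (n = 2k+1) is better than n = 2k-1.
assert win_prob(2*k + 1, 0.6) > win_prob(2*k - 1, 0.6)
# p < 1/2: the shorter series is better.
assert win_prob(2*k + 1, 0.4) < win_prob(2*k - 1, 0.4)
```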

The Geometric Random Variable

  • Suppose that we repeatedly and independently toss a coin with probability of a head equal to $p$, where $0 < p < 1$. The geometric random variable is the number $X$ of tosses needed for a head to come up for the first time. Its PMF is given by

    $$p_X(k) = (1-p)^{k-1} p, \qquad k = 1, 2, \ldots$$
  • More generally, we can interpret the geometric random variable in terms of repeated independent trials until the first “success.”
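The geometric PMF sums to 1 over $k = 1, 2, \ldots$ (a geometric series); a quick numerical sketch of ours shows the partial sums converging:

```python
# Partial sums of p_X(k) = (1-p)^(k-1) * p approach 1;
# the tail after K terms is (1-p)^K, which vanishes geometrically.
p = 0.2
partial = sum((1 - p)**(k - 1) * p for k in range(1, 201))
assert abs(partial - 1.0) < 1e-10
```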

The Poisson Random Variable

  • A Poisson random variable has a PMF given by

    $$p_X(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$$

    where $\lambda$ is a positive parameter characterizing the PMF. This is a legitimate PMF because

    $$\sum_{k=0}^{\infty} e^{-\lambda} \frac{\lambda^k}{k!} = e^{-\lambda} e^{\lambda} = 1$$

Form of the Poisson PMF.

  • The PMF $p_X(k)$ increases monotonically with $k$ up to the point where $k$ reaches the largest integer not exceeding $\lambda$, and after that point decreases monotonically with $k$. This follows from the ratio of consecutive terms:

    $$\frac{p_X(k)}{p_X(k-1)} = \frac{\lambda}{k}$$
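Since the ratio $\lambda/k$ exceeds 1 exactly while $k \leq \lfloor\lambda\rfloor$, the PMF peaks at $\lfloor\lambda\rfloor$; a brute-force check (our own sketch) confirms this:

```python
# Verify the Poisson mode: argmax of the PMF equals floor(lambda).
from math import exp, factorial, floor

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

lam = 4.5
mode = max(range(30), key=lambda k: poisson_pmf(k, lam))
assert mode == floor(lam)   # mode at floor(4.5) = 4
```

(When $\lambda$ is an integer the PMF ties at $\lambda - 1$ and $\lambda$; a non-integer $\lambda$ avoids the ambiguity.)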

Poisson approximation property

  • The Poisson PMF with parameter $\lambda$ is a good approximation for a binomial PMF with parameters $n$ and $p$:

    $$e^{-\lambda} \frac{\lambda^k}{k!} \approx \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}, \qquad \text{if } k \ll n,$$

    provided $\lambda = np$, $n$ is very large, and $p$ is very small. In this case, using the Poisson PMF may result in simpler models and calculations.

    • For example, let $n = 100$ and $p = 0.01$. Then the probability of $k = 5$ successes in $n = 100$ trials is calculated using the binomial PMF as $0.00290$. Using the Poisson PMF with $\lambda = np = 100 \cdot 0.01 = 1$, this probability is approximated by $0.00306$.
  • Proof: Consider the PMF of a binomial random variable with parameters $n \rightarrow \infty$ and $p \rightarrow 0$, while $np$ is fixed at a given value $\lambda$:

    $$p_X(k) = \frac{n!}{(n-k)!\,k!} p^k (1-p)^{n-k} = \frac{n(n-1)\cdots(n-k+1)}{n^k} \cdot \frac{\lambda^k}{k!} \left(1 - \frac{\lambda}{n}\right)^{n-k}$$

    Since, for each fixed $k$,

    $$\frac{n-k+j}{n} \rightarrow 1, \qquad \left(1 - \frac{\lambda}{n}\right)^{k} \rightarrow 1, \qquad \left(1 - \frac{\lambda}{n}\right)^{n} \rightarrow e^{-\lambda},$$

    we obtain, as $n \rightarrow \infty$,

    $$p_X(k) \rightarrow e^{-\lambda} \frac{\lambda^k}{k!}$$
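The numerical example above ($n = 100$, $p = 0.01$, $k = 5$) can be reproduced directly; this sketch is ours, not from the text:

```python
# Compare the exact binomial probability with its Poisson
# approximation for n = 100, p = 0.01, k = 5 (lambda = np = 1).
from math import comb, exp, factorial

n, p, k = 100, 0.01, 5
lam = n * p
binomial = comb(n, k) * p**k * (1 - p)**(n - k)   # ~ 0.00290
poisson = exp(-lam) * lam**k / factorial(k)       # ~ 0.00306

assert abs(binomial - 0.00290) < 1e-4
assert abs(poisson - 0.00306) < 1e-4
```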

Functions of Random Variables

  • Given a random variable $X$, one may generate other random variables by applying various transformations on $X$. If $Y = g(X)$ is a function of a random variable $X$, then $Y$ is also a random variable, since it provides a numerical value for each possible outcome.
  • If $X$ is discrete with PMF $p_X$, then $Y$ is also discrete, and its PMF $p_Y$ can be calculated using the PMF of $X$:

    $$p_Y(y) = \sum_{\{x \mid g(x) = y\}} p_X(x)$$
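The formula amounts to adding up the masses of all $x$-values that map to the same $y$; a short sketch of ours, with $Y = X^2$ and $X$ uniform on $\{-2, -1, 0, 1, 2\}$:

```python
# PMF of Y = g(X) = X^2: masses of x-values with the same g(x) add up.
from collections import defaultdict
from fractions import Fraction

p_X = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}

p_Y = defaultdict(Fraction)
for x, mass in p_X.items():
    p_Y[x * x] += mass      # p_Y(y) = sum over {x : g(x) = y} of p_X(x)

assert p_Y[4] == Fraction(2, 5)   # y = 4 comes from x = -2 and x = 2
assert sum(p_Y.values()) == 1     # p_Y is itself a legitimate PMF
```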

References

  • Introduction to Probability

Chapter 2 (Discrete Random Variables): Probability Mass Functions (PMF)
