The full steps are here:
https://github.com/kaldi-asr/kaldi/blob/master/egs/aishell/v1/run.sh
What follows starts from training the diagonal UBM.

sid/train_diag_ubm.sh --cmd "$train_cmd" --num-threads 16 data/dev 1024 exp/diag_ubm_1024

Now we use the dev data to train a diagonal UBM (universal background model; in essence just a Gaussian mixture model). So what exactly do we want to estimate? First, the number of Gaussian components in the mixture, which you set yourself (1024 in the command above; 2048 is also a common choice). Then, for each component, its weight and its own parameters: the mean and the (diagonal) covariance. Finally, components whose estimated weights are very small get pruned during training, so the final number of components can be less than the number requested.

Besides the usual parameter settings, data/dev must contain feats.scp and vad.scp.
The final output is exp/diag_ubm_1024/final.dubm.

Think about it for a moment: after MFCC extraction, every wav file becomes a matrix, one row per frame, so each frame can be seen as a point in a high-dimensional space. The dev set has many wav files, which together form a large cloud of such points. On the everything-is-Gaussian principle, given a pile of points, fitting a GMM to their distribution is the most basic estimation problem there is (it also helps to review the relationship between k-means and GMM). The VAD-filtered feats are still just a pile of high-dimensional points, and fitting one big mixture of Gaussians to them is exactly what train_diag_ubm.sh does.
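To make the "pile of points, fit a GMM" idea concrete, here is a toy EM fit of a two-component mixture in pure Python. This is a 1-D stand-in for the 60-dimensional diagonal-covariance case, not Kaldi's actual implementation:

```python
import math
import random

def em_gmm_1d(xs, iters=20):
    """Toy EM for a two-component 1-D Gaussian mixture; the same idea that
    train_diag_ubm.sh applies to 60-dim MFCC frames (one variance per dim)."""
    k = 2
    mus = [min(xs), max(xs)]          # crude deterministic initialization
    vars_ = [1.0, 1.0]
    ws = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in xs:
            lik = [ws[j] / math.sqrt(2 * math.pi * vars_[j])
                   * math.exp(-(x - mus[j]) ** 2 / (2 * vars_[j]))
                   for j in range(k)]
            s = sum(lik)
            resp.append([l / s for l in lik])
        # M-step: re-estimate weights, means, variances from soft counts
        for j in range(k):
            nj = sum(r[j] for r in resp)
            ws[j] = nj / len(xs)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            vars_[j] = max(
                sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs)) / nj,
                1e-4)
    return ws, mus, vars_

rng = random.Random(1)  # two made-up clusters around -5 and +5
data = ([rng.gauss(-5, 1) for _ in range(200)]
        + [rng.gauss(5, 1) for _ in range(200)])
weights, means, variances = em_gmm_1d(data)
```

After a few iterations the two estimated means settle near the two cluster centers, which is all the UBM training is doing, just with 1024 components in 60 dimensions.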

One question here. Last time we saw that the audio BAC009S0724W0121.wav was represented by a 328-row, 60-column matrix. Different audio files have different row counts but the same column count. As input to a neural network, both the rows and the columns would have to match. Does GMM fitting tolerate differing row counts? It does: the GMM is fit to individual frames (the rows), not to whole utterances, so each file simply contributes a different number of 60-dimensional points and no zero-padding is needed.
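For GMM training the row counts do not need to match: the frames of all utterances are simply pooled into one set of 60-dimensional points, each row being an independent training point, so no padding is required. A sketch with made-up frame counts:

```python
# Hypothetical frame counts; only the column count (feature dim) is fixed.
utt_frames = {
    "utt1": [[0.1] * 60 for _ in range(328)],  # 328 frames x 60 dims
    "utt2": [[0.2] * 60 for _ in range(517)],  # a longer utterance
    "utt3": [[0.3] * 60 for _ in range(90)],   # a shorter one
}

# UBM training pools every frame of every utterance into one point cloud;
# each row is one training point, so row counts may differ freely.
pooled = [frame for frames in utt_frames.values() for frame in frames]
```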

Below is the output produced during the run:

sid/train_diag_ubm.sh --cmd run.pl --num-threads 16 data/dev 1024 exp/diag_ubm_1024
sid/train_diag_ubm.sh: initializing model from E-M in memory,
sid/train_diag_ubm.sh: starting from 512 Gaussians, reaching 1024;
sid/train_diag_ubm.sh: for 20 iterations, using at most 500000 frames of data
Getting Gaussian-selection info
sid/train_diag_ubm.sh: will train for 4 iterations, in parallel over
sid/train_diag_ubm.sh: 4 machines, parallelized with 'run.pl'
sid/train_diag_ubm.sh: Training pass 0
sid/train_diag_ubm.sh: Training pass 1
sid/train_diag_ubm.sh: Training pass 2
sid/train_diag_ubm.sh: Training pass 3

16 threads, 4 parallel jobs, only 4 iterations, at most 500,000 frames, and so on: these are all preset parameters that can be changed. The algorithm inside train_diag_ubm.sh is a bit involved, but it is essentially the EM algorithm.
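The reason the E-step can be split over 4 jobs is that its sufficient statistics (soft counts plus first- and second-order sums) are additive, so per-job accumulators can simply be summed before the M-step. A minimal illustration for a single Gaussian, with responsibilities fixed at 1 for simplicity:

```python
def accumulate(frames):
    """Zeroth/first/second-order statistics for a list of 1-D frames."""
    n = len(frames)
    s1 = sum(frames)
    s2 = sum(x * x for x in frames)
    return (n, s1, s2)

def merge(a, b):
    """Sum two accumulators element-wise, as the parallel jobs' stats are."""
    return tuple(x + y for x, y in zip(a, b))

data = [0.5, 1.5, 2.5, 3.5]
whole = accumulate(data)                                      # one pass
parallel = merge(accumulate(data[:2]), accumulate(data[2:]))  # two "jobs"
# both yield identical stats, hence identical mean/variance re-estimates
```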

You can inspect the generated dubm file with the following command:
/data/kaldi/src/gmmbin/gmm-global-copy --binary=false final.dubm final_dubm.txt
(run it from the directory that contains final.dubm)
Opening the result in vi shows what gconsts, weights and means_invvars look like.
Is gconst the name of each component? If so, why does it contain negative numbers and decimals?
For what these parameters mean in a GMM, see:
http://notes.funcwj.cn/2017/05/28/kaldi-gmm/
(some of the math formulas there may not render unless you browse the page through a proxy)
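To my understanding, GCONSTS is not a component label but the precomputed constant term of each component's log-likelihood, as in Kaldi's DiagGmm: log w - (D/2) log 2π + ½ Σ log(1/σ²_d) - ½ Σ μ²_d/σ²_d. With D = 60 the -(D/2) log 2π term alone is about -55, so negative fractional values are exactly what we should see. A quick check with a made-up component:

```python
import math

def gconst(weight, means, variances):
    """Constant term of one diagonal-Gaussian component's log-likelihood
    (the GCONSTS formula as I read it from Kaldi's diag-gmm code)."""
    d = len(means)
    return (math.log(weight)
            - 0.5 * d * math.log(2 * math.pi)
            + 0.5 * sum(math.log(1.0 / v) for v in variances)
            - 0.5 * sum(m * m / v for m, v in zip(means, variances)))

# made-up component: weight ~ 1/1024, 60 dims, small means, unit variances
g = gconst(1.0 / 1024, [0.1] * 60, [1.0] * 60)
```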
A quick look at the head of the file:

<DiagGMM>
<GCONSTS>  [ -168.5517 -123.4348 -133.8201 -114.7822 -151.7901 -131.8667 -124.6407 -129.6156 -119.5983 -155.5331 -141.1062 -144.089 -135.3436 -140.6498 -141.9448 -137.6184 -140.7643 -157.9445 -133.8951 -125.899 -150.2111 -142.5286 -149.2338 -141.5738 -153.825 -136.6607 -159.3391 -140.8754 -146.2305 -151.0263 -151.3518 -162.195 -120.5273 -129.7599 -144.4364 -143.8065 -140.654 -160.7943 -153.1309 -140.5052 -120.1089 -130.4272 -142.9462 -133.6798 -114.27 -150.9726 -140.6008 -127.8027 -123.2906 -130.6685 -120.9457 -142.8364 -140.9074 -166.1898 -140.0086 -143.3855 -141.0461 -137.6288 -142.4091 -140.5384 -117.8093 -128.3808 -114.6294 -135.5237 -136.9998 -131.7942 -159.4724 -161.2356 -135.8031 -136.7511 -153.5796 -122.48 -147.5269 -132.4072 -145.3951 -129.0328 -144.9012 -146.7066 -143.9918 -109.4965 -118.4989 -140.4688 -121.4221 -132.6446 -148.5265 -130.3394 -152.9196 -120.998 -146.9657 -130.5994 -148.1411 -142.714 -130.3872 -127.2332 -131.0156 -128.2835 -102.3352 -115.6484 -137.4211 -155.424 -120.4553 -147.565 -129.3216 -128.8561 -151.3668 -120.7073 -138.1742 -145.4839 -157.6234 -137.7653 -131.8304 -153.8659 -142.7568 -139.9195 -147.3725 -132.5825 -139.0288 -149.8651 -136.7478 -166.6607 -150.2914 -145.4931 -127.6105 -139.8623 -124.5044 -142.1934 -139.9203 -139.8712 -140.6237 -124.7048 -121.5863 -132.7947 -140.9013 -149.3519 -134.6002 -130.9579 -132.6345 -138.2992 -134.0031 -154.9517 -151.0343 -130.2851 -132.1985 -152.1494 -159.3548 -141.1165 -135.5849 -141.0246 -126.2608 -144.9648 -131.0665 -140.7204 -119.6241 -133.2392 -133.4465 -151.9079 -120.7533 -120.1852 -149.3676 -127.6366 -148.7456 -153.6342 -136.2757 -148.3535 -172.2157 -128.4765 -143.189 -130.8889 -136.148 -142.3808 -140.5706 -133.363 -138.0361 -133.569 -130.6047 -137.8063 -153.92 -121.21 -137.5255 -114.4114 -132.5074 -158.589 -142.1535 -153.6232 -164.781 -149.3506 -126.7041 -151.7858 -127.7331 -134.3621 -109.3684 -126.3345 -136.6309 -123.3525 -149.184 -138.0595 -157.5321 -143.8427 -124.5696 -141.737 -155.216 
-122.5459 -135.7957 -153.1994 -140.627 -139.9158 -168.8335 -130.1261 -128.1531 -135.5734 -127.9323 -121.744 -133.6361 -104.6675 -141.379 -124.9319 -141.4576 -134.265 -134.9557 -135.9324 -139.4159 -149.1574 -140.6428 -140.2407 -138.8096 -153.9716 -160.0007 -146.7496 -123.6143 -141.3186 -112.2816 -132.2316 -147.2952 -139.4095 -112.8964 -139.3286 -125.9173 -125.524 -133.4844 -122.9121 -120.0521 -137.1353 -130.6129 -135.6262 -160.6152 -132.3525 -143.2796 -136.9192 -146.1013 -135.1834 -125.4476 -156.847 -127.4387 -132.8019 -155.1817 -128.1749 -135.8983 -113.7031 -143.1815 -131.6463 -129.2455 -125.6998 -145.9696 -158.0379 -139.1829 -127.4171 -135.2646 -162.2996 -163.0162 -145.8636 -147.5465 -125.3382 -121.36 -138.6498 -138.5074 -130.9844 -133.9674 -140.6826 -135.7107 -152.2603 -137.4241 -128.0944 -134.6893 -134.2749 -129.3741 -148.8402 -136.1093 -125.5339 -146.0685 -124.3971 -143.7453 -146.5066 -136.0755 -147.6885 -134.2977 -149.77 -117.9916 -150.5902 -133.5968 -134.2478 -130.1301 -138.2747 -137.1242 -137.3589 -149.1925 -140.1207 -127.9384 -125.8932 -135.2101 -126.0765 -137.2661 -161.6089 -131.5812 -139.9071 -168.8649 -164.025 -126.4064 -121.9461 -141.5728 -121.6942 -149.5763 -153.1793 -106.7022 -110.6957 -120.604 -134.1552 -144.157 -151.4764 -128.287 -145.9172 -127.5017 -140.2188 -121.2034 -136.0809 -117.6584 -135.9354 -145.1597 -130.172 -158.8935 -136.0182 -134.3848 -146.2378 -130.6086 -152.1252 -135.4229 -122.4517 -131.2916 -138.7907 -152.0422 -136.7714 -137.7505 -145.3691 -149.9497 -140.0387 -140.2083 -114.1855 -135.8274 -152.9926 -130.4249 -123.6626 -126.6622 -159.7115 -129.1124 -162.8806 -132.3096 -136.7282 -125.7833 -147.7782 -135.8698 -125.9852 -132.1975 -134.5161 -165.0727 -147.7679 -155.4951 -112.4326 -148.6761 -140.9041 -130.0811 -121.6111 -131.776 -132.5161 -150.8849 -124.688 -125.1094 -131.141 -141.3903 -133.5291 -134.6705 -140.4974 -133.4458 -129.5898 -121.4164 -131.7224 -129.7135 -139.5248 -134.6768 -135.5128 -144.2457 -128.2253 -140.672 -141.6892 
-140.4842 -143.7521 -145.0936 -145.6931 -131.8956 -135.586 -128.9934 -114.0958 -147.4272 -130.5117 -133.5587 -130.3018 -146.4075 -153.944 -148.4697 -136.1697 -133.8414 -122.3045 -139.8435 -136.6656 -142.2034 -143.338 -149.8649 -136.9045 -127.2488 -144.9095 -154.9302 -147.4602 -117.4799 -137.6045 -148.7106 -128.6325 -140.7392 -153.8039 -135.3087 -105.483 -143.2361 -146.4795 -135.1578 -145.0745 -151.1003 -130.402 -134.8598 -152.1717 -125.1529 -138.595 -133.6958 -151.1548 -140.1014 -162.5487 -143.1621 -128.5319 -159.4665 -150.3442 -120.5501 -143.6444 -134.6664 -125.3759 -135.7504 -154.0327 -148.0756 -139.3791 -139.238 -135.8434 -148.0824 -137.0081 -150.5679 -112.5646 -148.5415 -117.2756 -144.8058 -130.8228 -110.3884 -118.137 -130.0291 -117.237 -114.7207 -138.6573 -109.821 -113.1791 -128.679 -116.6654 -113.9586 -120.0652 -127.8632 -123.4494 -127.1957 -115.4604 -136.3827 -117.5852 -134.8629 -117.2011 -136.0327 -118.4211 -134.9663 -142.6711 -121.6644 -114.5919 -109.8936 -113.0581 -120.7176 -119.6816 -108.4369 -116.5927 -118.0388 -123.3612 -150.9342 -152.3249 -112.436 -109.1332 -109.179 -115.398 -123.2784 -112.3223 -128.1319 -116.8282 -106.4612 -107.7429 -128.5541 -115.3369 -110.0671 -127.8976 -132.1649 -126.1945 -126.2208 -113.9105 -114.1658 -119.7805 -124.0996 -121.2134 -121.8631 -123.789 -133.2199 -127.6365 -121.1162 -140.2933 -129.793 -132.5981 -131.2865 -115.6388 -136.4279 -142.1064 -108.8168 -139.4755 -123.9469 -131.061 -129.2012 -127.7298 -123.1486 -128.1555 -107.9403 -114.3784 -139.136 -136.123 -129.4115 -129.5293 -126.4045 -125.5002 -127.4661 -128.1968 -132.1601 -113.8633 -111.5471 -103.5879 -117.6509 -133.6172 -106.2767 -105.6394 -130.7527 -111.0063 -119.1499 -140.0991 -133.0108 -119.1382 -139.7953 -118.8322 -141.0854 -122.652 -133.3576 -132.662 -135.9778 -105.0654 -130.2086 -154.0142 -111.4836 -124.1987 -122.4706 -136.5257 -131.8087 -115.4328 -115.7144 -112.701 -132.6026 -132.0461 -102.7311 -113.9282 -147.3691 -134.5996 -109.1603 -132.7412 -134.3094 -133.0096 
-145.5055 -152.3269 -114.023 -148.6696 -141.7355 -123.7425 -159.8965 -102.1794 -108.4105 -122.6972 -126.1676 -132.8054 -114.3668 -137.9692 -123.5095 -123.5784 -139.9501 -149.816 -133.8239 -127.9666 -116.7658 -106.2474 -142.6747 -135.5598 -110.2893 -116.1492 -102.7393 -137.5147 -125.4834 -151.7289 -114.5708 -145.0391 -119.9022 -121.6045 -132.0187 -144.0502 -128.0489 -141.6433 -136.5684 -115.035 -121.7155 -109.265 -120.0038 -127.7607 -121.9754 -137.3721 -142.2109 -129.2093 -120.4347 -114.6065 -112.1573 -128.4038 -135.1196 -138.4161 -164.6638 -152.6776 -118.3095 -123.77 -105.0601 -138.522 -127.9251 -137.4612 -107.7167 -129.7034 -115.6126 -120.6616 -135.5858 -135.8989 -123.7609 -132.5893 -127.3985 -133.5243 -152.3719 -118.6119 -141.8845 -133.5457 -129.7389 -151.7316 -140.4838 -127.3351 -135.6635 -135.2687 -130.4287 -121.4925 -137.361 -145.1585 -142.5096 -97.78353 -124.6344 -116.6442 -140.2335 -131.056 -135.0486 -131.4666 -104.6255 -119.9211 -112.4795 -126.1451 -106.2459 -130.2433 -107.882 -120.0558 -113.3636 -145.006 -135.1932 -138.4538 -145.5085 -146.9653 -120.3658 -132.2691 -129.6325 -144.812 -110.3562 -129.259 -142.0971 -107.3322 -111.6298 -117.0875 -118.3415 -155.0153 -96.34048 -147.7771 -127.0667 -136.0504 -122.446 -133.7134 -128.4926 -143.5608 -130.4109 -147.2307 -109.2763 -131.0805 -145.8178 -127.8922 -119.9944 -113.5458 -140.8062 -140.6818 -134.1404 -125.9092 -128.3159 -135.6806 -137.0267 -123.1888 -126.9309 -129.0802 -126.4642 -137.2626 -136.2963 -124.1761 -136.4276 -135.6142 -143.2414 -121.044 -124.4154 -118.9165 -150.9047 -156.7093 -112.8713 -120.473 -145.4004 -143.2721 -140.1841 -129.9592 -136.651 -143.3927 -134.5494 -130.8368 -132.8942 -147.6479 -113.2776 -140.8528 -144.499 -129.1364 -121.364 -116.5077 -147.247 -106.084 -116.9772 -128.0942 -113.3637 -132.0165 -122.1363 -136.6333 -133.6756 -147.8817 -135.3727 -125.9566 -127.0718 -128.822 -138.9064 -126.4806 -142.4079 -129.0056 -144.1167 -119.966 -140.8514 -133.7383 -120.2155 -142.0414 -126.1896 -135.5472 
-122.1589 -132.1864 -135.1242 -145.1986 -118.0691 -141.9248 -116.4534 -131.7052 -112.3564 -112.4963 -127.9443 -139.9226 -116.0883 -137.3257 -132.7593 -143.373 -118.7111 -128.3941 -129.3248 -145.7266 -130.339 -134.971 -133.6227 -130.5445 -136.4407 -150.7239 -136.9688 -133.1618 -130.7014 -144.9384 -114.1385 -131.5274 -125.2469 -139.2286 -131.5861 -136.3109 -136.0261 -138.9444 -139.3573 -141.1163 -131.265 -132.704 -157.8972 -159.1407 -142.1942 -150.9301 -146.1201 -131.7576 -130.758 -158.8962 -126.1001 -138.5498 -136.0849 -140.5055 -133.235 -130.1385 -111.6726 -126.8055 -113.1545 -136.6334 -137.4222 -119.5909 -132.1279 -120.1718 -116.8517 -118.6632 -132.9723 -127.6355 -130.1844 -124.4003 -107.0525 -114.3542 -121.8363 -127.3008 -150.8082 -122.7728 -124.6926 -141.3457 -125.8128 -145.3772 -124.5326 -105.6236 -133.0691 -128.4308 -135.6479 -136.6485 -120.8425 -124.1547 -130.2749 -113.4296 -142.489 -127.9135 -148.612 -118.0814 -124.5433 -158.48 -130.542 -136.6844 -131.0564 -146.732 -145.6229 -134.3411 -158.6641 -148.7942 -116.4812 -143.0144 -131.102 -137.1155 -131.877 -112.6555 -140.1234 -143.7899 -130.04 -120.1367 -115.8177 -102.8452 -132.5429 -135.6685 -121.303 -137.8269 -142.2375 -122.0038 -108.1188 -129.834 -140.5089 -113.0632 -129.7294 -134.6061 -135.653 -99.84734 -107.5975 -138.2833 -130.2392 -139.9353 -137.3117 -109.4608 -139.9711 -125.7823 -135.9701 -113.2854 -153.4162 -173.8303 -116.1844 -125.7708 -137.7507 -131.2065 -138.063 -140.1509 -117.9442 -137.7581 -120.7781 -130.5949 -138.1464 -120.4804 -128.9505 -125.691 -151.4106 -100.2253 -134.0161 -145.4214 -152.3331 -114.2931 -106.6924 -136.3821 -124.275 -130.8945 -137.273 -139.273 -133.3651 -135.905 -119.4938 -142.8046 -123.9047 -128.1923 -127.2908 -128.1913 -117.4398 -140.2155 -121.041 -114.3894 -135.1122 -139.5002 -111.5211 -112.4945 -107.8582 -120.6551 -128.5418 -152.908 -153.5582 -141.614 -126.1192 -137.7973 -162.7703 -112.6749 -125.1983 -117.6061 -151.9657 -139.2378 -137.3553 -145.623 -135.381 -140.9684 -142.6324 
-146.5051 -127.6956 -123.562 -146.3421 -150.71 -136.832 -124.5916 -136.4901 -121.3504 -120.333 -101.9702 -136.6512 -144.4799 -111.3158 -130.0637 ]
<WEIGHTS>  [ 0.0011068 0.001587202 0.0007463233 0.0007865152 0.0007990212 0.001196475 0.0008616272 0.0009741692 0.001390153 0.0009692346 0.0008880548 0.0009997055 0.001150572 0.000892639 0.000909156 0.0009387451 0.0009454571 0.0006536849 0.001220323 0.001170463 0.0009707686 0.001000407 0.000806024 0.0007653495 0.0007718971 0.0009957516 0.000559394 0.0009186999 0.0007671865 0.0009666851 0.0007846465 0.001010104 0.001373252 0.001053545 0.0007890331 0.001081474 0.0009659806 0.0007751822 0.0007482817 0.0007908923 0.00114984 0.001077101 0.0007591745 0.001229645 0.001327407 0.0009279329 0.0007050617 0.0008495388 0.0009272065 0.0007988781 0.001121042 0.0008109472 0.001081998 0.0009830003 0.0008015115 0.0009254331 0.0009142506 0.0007207548 0.0009890347 0.0009162937 0.001246491 0.001021293 0.0008319031 0.001028408 0.00102481 0.001065388 0.0009406096 0.0007257905 0.0008778835 0.0007476977 0.001219824 0.0007508551 0.001081344 0.0009959239 0.001026544 0.001119189 0.001102163 0.0009397262 0.001142834 0.0007832733 0.001159922 0.0009548706 0.000727284 0.0006640842 0.0008713582 0.001223444 0.001029095 0.001059789 0.001170862 0.0009328867 0.0006836139 0.0007632865 0.001085197 0.0009513039 0.00101931 0.00116072 0.001252134 0.00101308 0.0008795339 0.001055012 0.00100801 0.0008057043 0.0008351408 0.001017999 0.001126815 0.001163093 0.000960787 0.00107202 0.0007366747 0.001014299 0.0008591401 0.001173406 0.0009498527 0.0008466808 0.001001329 0.0008663117 0.0009570117 0.001048565 0.001220044 0.001078781 0.0009321235 0.00100331 0.0008916259 0.0007222798 0.001106135 0.001036583 0.0009899568 0.0007911813 0.001025466 0.001002455 0.001117791 0.00092463 0.0008615711 0.0007592412 0.001193387 0.00105682 0.001097459 0.001106507 0.001009366 0.0009734376 0.001099895 0.0011464 0.0009734764 0.0008014641 0.0009931506 0.0008705879 0.0006847327 0.0009564002 0.000901656 0.001045643 0.001096586 0.0009109275 0.001244145 0.001076677 0.0009922611 0.0009221233 0.001228288 0.0009772603 0.0007228285 
0.001049067 0.00113019 0.001085738 0.0006161294 0.001306626 0.0006199583 0.001145177 0.0008713289 0.001294666 0.001113994 0.0009252689 0.001044725 0.001064965 0.001074013 0.0008664316 0.00105372 0.000816335 0.0008567261 0.0008745572 0.000877934 0.0009264463 0.0007914156 0.0009493554 0.0008089466 0.000936009 0.0009079427 0.0008195526 0.0009700517 0.000522843 0.000986431 0.0008274256 0.001013183 0.001133374 0.0009044543 0.0009006236 0.000795398 0.001019885 0.0007841844 0.001424899 0.001082041 0.001002464 0.0007821572 0.00102846 0.001032175 0.0008202994 0.0009652909 0.0009384998 0.0007736447 0.001067729 0.001255908 0.0009350528 0.0008069419 0.00116712 0.001193904 0.0009718073 0.001162331 0.001224325 0.001007123 0.0006967225 0.001296111 0.0009295176 0.0008137836 0.0008469027 0.001013134 0.0009327521 0.0009960295 0.0009348096 0.0009804637 0.000750187 0.0009966696 0.0009485449 0.001101059 0.001049047 0.001043805 0.001215759 0.0009646514 0.00118139 0.001037795 0.0008929282 0.001016548 0.001097244 0.001039535 0.0008782162 0.0008758016 0.0009539909 0.0009125094 0.0007351286 0.0008115937 0.0009929638 0.0007880733 0.001215868 0.001133705 0.0006302312 0.001116741 0.0009532111 0.0007996427 0.001060015 0.001027821 0.001083657 0.0008127945 0.0009923235 0.0007852551 0.0008247576 0.0007939122 0.001160634 0.0008108501 0.001032743 0.0009212708 0.0008994745 0.001022774 0.0009005373 0.0008749668 0.0009447677 0.00120943 0.000801941 0.001098405 0.001112264 0.001270797 0.001098169 0.0007862018 0.0008039983 0.0009014733 0.001043789 0.001076221 0.001328479 0.0009702138 0.001071413 0.001049567 0.0007872565 0.001123285 0.0008591172 0.001090682 0.0008780765 0.0009863075 0.0006756671 0.00112976 0.0007170125 0.001021288 0.0008929567 0.0009984347 0.001232252 0.001248783 0.0007283564 0.001123966 0.0009445815 0.0007145025 0.0007922535 0.0009238379 0.001190158 0.0009754542 0.001116645 0.0008892238 0.0009455658 0.001016184 0.001024319 0.0007720785 0.0009935054 0.001143233 0.0008743114 0.000849105 
0.001037722 0.0007886557 0.0009766645 0.0009878154 0.0009450638 0.0008786656 0.001233362 0.001243993 0.001065811 0.0009236113 0.0007764056 0.0008559889 0.000800968 0.001019292 0.001071064 0.0009658664 0.0009700672 0.0006977918 0.001112922 0.0007395761 0.001097843 0.0008815692 0.0006161141 0.001046481 0.001050262 0.0009638027 0.0008951116 0.001184924 0.0009778871 0.0009259957 0.0009306781 0.0008326881 0.0009268583 0.001050826 0.001092848 0.0009349638 0.00082398 0.001162283 0.0006444685 0.000920315 0.0008363435 0.001091744 0.0006779094 0.0008694745 0.0008218211 0.001074362 0.001040523 0.001249242 0.001086514 0.0009425626 0.001019603 0.001024458 0.0008243957 0.0008471459 0.0007992034 0.001288408 0.0008771851 0.001135647 0.0008978263 0.0007827354 0.001173302 0.001191676 0.001172106 0.0008189736 0.001133125 0.001120296 0.001122704 0.0007357521 0.00104145 0.0009051191 0.001003921 0.001043792 0.0009475801 0.000974751 0.0009163336 0.0009785395 0.0008771993 0.001037147 0.000918894 0.0008434072 0.001023835 0.0007839086 0.0008698659 0.001265087 0.0006656632 0.0008194363 0.00098635 0.000852057 0.0008632793 0.0007027657 0.001009119 0.001202099 0.001025118 0.001104615 0.00116414 0.0007562454 0.000781209 0.001114974 0.001001723 0.0007407159 0.0009696311 0.000813493 0.001375999 0.0008394729 0.0009631117 0.0008456808 0.0009817611 0.0009965472 0.000844062 0.001077488 0.0008288455 0.0009362187 0.00105563 0.000895844 0.001329567 0.001148257 0.0008137234 0.0008865572 0.0009983827 0.001216448 0.0009224525 0.001161515 0.0009462127 0.0009505623 0.001021581 0.00112273 0.0008384064 0.0009345466 0.0007206476 0.001008294 0.0009676744 0.001028272 0.000700797 0.001003422 0.0008312624 0.001095986 0.0008490161 0.00119755 0.0009970623 0.0009759653 0.0008723229 0.000878187 0.001077147 0.0009227308 0.0006720679 0.000710225 0.0009845654 0.0009752471 0.0008186856 0.0008804006 0.000962389 0.0009220197 0.00114546 0.0008782442 0.0005320025 0.001158947 0.001020196 0.0007497177 0.001061987 0.0009402268 
0.0008373352 0.0009512942 0.001102037 0.001299211 0.001201109 0.0009801739 0.00110437 0.001017547 0.001132684 0.0006501562 0.0009306427 0.001081909 0.000878125 0.000906715 0.0009742277 0.001023896 0.001169294 0.001314035 0.0006162439 0.001146304 0.0008783693 0.001275216 0.000899425 0.001071924 0.0009122282 0.0009486207 0.00110944 0.001115811 0.0009634285 0.0007151271 0.0008853355 0.0008129139 0.0008703459 0.0008021616 0.001132993 0.0009903486 0.001247864 0.001047362 0.000952859 0.0009907631 0.001121163 0.001111669 0.0009028241 0.0009169802 0.0008648923 0.001062139 0.0007298256 0.0008781757 0.001004914 0.001259621 0.001009574 0.001119667 0.0008038381 0.00115578 0.000883916 0.001128834 0.0009756132 0.00111833 0.0009123006 0.0008024367 0.0009354173 0.0005003358 0.001070219 0.0009713947 0.0007139069 0.001007733 0.0009442033 0.001028949 0.0008293585 0.001104604 0.001230458 0.00135384 0.001039208 0.0009473856 0.001152998 0.0009713329 0.001211313 0.001083047 0.001039202 0.001184956 0.001022242 0.0007549305 0.0007337353 0.001118293 0.0009246807 0.001164965 0.001095602 0.001093888 0.001004055 0.0009849244 0.000929293 0.001096423 0.001252425 0.001194288 0.0008057391 0.001141782 0.0009313908 0.0007799588 0.001162007 0.0009049855 0.001189687 0.001082393 0.001208913 0.0008957619 0.001409426 0.001082244 0.0006078228 0.001594119 0.001316395 0.0009829073 0.000843545 0.0007820086 0.0008455078 0.0009362829 0.001019908 0.001442038 0.001138097 0.001210967 0.001029296 0.0007993414 0.0005536536 0.001308666 0.001050879 0.001010634 0.001271465 0.0007756071 0.001052783 0.001213507 0.001058382 0.0009414324 0.0009028628 0.0009032743 0.001141214 0.0008948573 0.00114591 0.0009138205 0.0008743712 0.0006172007 0.0006530177 0.001136744 0.001288641 0.0007149322 0.001003426 0.0009276656 0.001063681 0.001011813 0.001252064 0.0007521355 0.0008338633 0.001380413 0.0008885451 0.0009641867 0.001069305 0.0008430021 0.0009413899 0.0009910377 0.001180957 0.001016727 0.001070012 0.001139318 0.0008542115 
0.0009889617 0.0007171387 0.0008122745 0.001200803 0.001150077 0.0009270346 0.0008006504 0.0005389926 0.001099186 0.001225633 0.001011815 0.0007418834 0.001134231 0.001023308 0.0007915297 0.0009608408 0.001111359 0.00087757 0.0008864938 0.0008684347 0.001136722 0.001249432 0.001062028 0.0009950724 0.001326896 0.0008843311 0.0009346225 0.0008233033 0.001127957 0.0008423575 0.001026792 0.001053782 0.001194522 0.00100844 0.0007612198 0.0009318764 0.0007913386 0.0009574485 0.001181468 0.0009223242 0.001026052 0.0008897575 0.001029664 0.00119795 0.001130447 0.001196815 0.00116625 0.0008407071 0.0008284054 0.001010242 0.0008274943 0.001253622 0.00115739 0.001379108 0.0009774064 0.0009020481 0.0009166629 0.0007906354 0.000993224 0.0008187407 0.0007339713 0.0009662876 0.0009690147 0.0008476632 0.000985116 0.001245336 0.0008714707 0.00113036 0.0008331733 0.0008982512 0.0008432762 0.001030236 0.001144438 0.0009623239 0.001051295 0.001125993 0.001092563 0.001102643 0.001315502 0.001079186 0.001297256 0.001533449 0.001134183 0.001062355 0.001147519 0.001054137 0.001057918 0.001125532 0.001269234 0.001541417 0.001080461 0.001103285 0.001206149 0.0008335871 0.0009428069 0.001025857 0.0009437571 0.0007589766 0.001159769 0.001306221 0.0009708087 0.0009468445 0.001300233 0.0007251885 0.000882524 0.001192995 0.0008827257 0.0009675216 0.001183639 0.001061805 0.000940682 0.001045214 0.0006397251 0.0008767993 0.0006253974 0.000935561 0.0007846209 0.0009958164 0.001143093 0.0009119181 0.0008470981 0.000959707 0.001417052 0.0009870076 0.0008354062 0.0007879662 0.00101558 0.001307569 0.001067636 0.0009523348 0.001271924 0.001248678 0.001125012 0.0008997571 0.001179744 0.0009703147 0.0009621977 0.001450228 0.0009218616 0.001223065 0.0008093573 0.0008707681 0.001083404 0.0009992023 0.001311896 0.000858245 0.001092574 0.001098419 0.001060377 0.000726603 0.001023082 0.0009796394 0.001210458 0.001081239 0.001219566 0.001136436 0.0007743248 0.0009744433 0.0009719715 0.0008507308 0.0008767272 
0.0008684593 0.001340895 0.000749827 0.001023512 0.0008213553 0.001127994 0.001216572 0.001209929 0.0008383004 0.000759642 0.00111786 0.0009328116 0.001163917 0.001003457 0.0008462216 0.0006920684 0.001009188 0.001340004 0.000930703 0.001078721 0.0008655426 0.00101983 0.001026732 0.001194061 0.0008527294 0.001092474 0.0009049737 0.000854197 0.00115034 0.0008222983 0.00065265 0.0009092777 0.000830026 0.0008388902 0.001051498 0.0007663698 0.0009941155 0.001052918 0.000977123 0.001000625 0.001095497 0.001132503 0.0008442253 0.0008468283 0.001334643 0.00111468 0.0009835385 0.0006867731 0.000750391 0.001161917 0.0008988205 0.0009415463 0.001078457 0.0007558576 0.000798589 0.001300171 0.0009857084 0.0007510338 0.0007966257 0.0009340894 0.001181609 0.0008223109 0.001021806 0.000940235 0.0009253292 0.0008881488 0.001218624 0.000916209 0.001024969 0.00123599 0.0008252095 0.0009181129 0.0009778803 0.001159921 0.001321249 0.000824433 0.0009268258 0.001166578 0.0008976082 0.0008795294 0.0008648402 0.001348478 0.0007865604 0.0009451899 0.0009386957 0.0008888441 0.001016049 0.001001649 0.001002833 0.0009266452 0.001019064 0.001039295 0.001209627 0.0007803416 0.001073028 0.0008309651 0.0007758213 0.0008188969 0.0007974615 0.00103651 0.0008587135 0.001029236 0.0007847497 0.001133176 0.001022743 0.0008120248 0.0009724168 0.0007658618 0.0007542765 0.00088825 0.0008784303 0.001201669 0.001089506 0.001043364 0.0007554571 0.001105538 0.001003065 0.0008488902 0.001018935 0.0009932966 0.001031236 0.0009509723 0.0009865817 0.001028859 0.001099152 0.001228605 0.001034992 0.001207348 0.0008367907 0.001105133 0.0008211829 0.001047083 0.001001073 0.001102984 0.0009556398 0.0008596774 0.0009818297 0.001049001 0.00111743 0.0006945709 0.0009740257 0.0009238962 0.001051411 0.001272134 0.00131693 0.0008842227 0.0008074488 0.0008700176 0.0009475974 0.001080907 0.0009927452 0.0008550014 0.0009527968 0.0009005847 0.0008149749 0.0007357437 0.0007537986 0.0009379553 0.0009648139 0.0006864414 
0.0007545323 0.0007073009 0.0008749181 0.0007827102 0.000897106 0.0008571576 0.001045196 0.0007478844 0.00104136 0.0009896196 0.0007465029 0.000951812 0.0009136021 0.001149356 0.000941528 0.000932759 0.0008145184 0.001234828 0.0009228297 0.0008599897 0.001034319 0.0008741652 0.0009260857 0.0008913527 0.001007735 0.0009464993 0.0008090531 0.001059185 0.00101454 0.0008577525 0.001097988 0.0008595414 0.0008064451 0.001006287 0.0009901655 0.0008362484 0.0009261838 0.0007329686 0.0008389379 0.001130761 0.0008779621 0.0007017283 0.000650667 0.0008419697 0.0006803587 0.0009602706 0.0008048798 0.0008313484 0.0007097562 0.0006831653 0.0007274419 0.0007741128 0.0007983499 0.0008315475 0.001056528 0.0008730093 0.001296314 0.0009979646 0.001028583 0.0007440595 0.0007181569 0.001087983 0.0008534379 ]
<MEANS_INVVARS>  [-0.3691286 -0.7612579 0.4133788 -0.008243892 0.155071 0.01934223 0.07031387 0.04019918 -0.05293203 0.06376284 -0.01933178 0.01854775 -0.03849428 -0.006190215 0.01679744 -0.01494662 -0.08331328 0.02913744 -0.08784764 -0.1550963 5.783783 0.3382353 -0.321707 0.5452974 -0.8016703 -0.5252054 -0.5358236 -0.320221 -0.3783515 -0.2297121 0.07914644 -0.04414753 0.09135744 0.06660827 0.1613672 0.2681586 0.2156018 0.8651927 0.7390725 0.3915239 14.58918 4.367587 -1.898968 0.6075097 -2.65016 -1.758048 -1.521758 -0.6204146 0.1662107 -0.8328322 0.1915271 -0.5168325 0.07958407 -0.7807627 -0.5214539 -0.2640421 0.6359934 0.4019113 1.829137 3.879026 -2.962274 -0.2063778 0.06743504 0.1044037 0.1866357 0.1561655 0.1750038 0.1670152 0.1086037 0.07466064 0.1354892 0.1388678 0.1128836 0.1352201 0.1538395 0.1511284 0.1424095 0.07677746 0.05641692 -0.001321141 -0.3288379 0.005483055 -0.0924904 -0.03812382 -0.03450923 -0.07808585 -0.08410891 -0.04883831 -0.01933702 0.006084827 -0.02337423 -0.01561838 -0.03149309 -0.04027946 -0.01626203 -0.01025907 0.0252059 0.06748287 0.04012919 0.05639087 -12.99426 0.1021747 -0.3815793 -0.3775249 -0.3153312 -0.4613559 -0.6979104 -0.4207435 -0.03734165 -0.1084071 -0.4683482 -0.5058808 -0.321712 -0.4863708 -0.3402151 -0.3107959 -0.2335471 0.01526861 0.2214068 0.2525001 1.84444 -0.07497685 -0.136919 0.04494506 -0.1753943 -0.1747912 -0.084792 0.1046723 -0.04205658 0.06666286 0.002008226 0.09094954 -0.03396274 -0.02070604 -0.01337663 -0.02545875 -0.08546341 -0.02607786 -0.06552417 -0.144749 -6.109453 0.4680264 0.8744445 0.672425 -0.009394559 -0.05138379 -0.07366442 -0.3123111 -0.1528986 -0.1783032 0.2676751 0.04859122 0.2159922 0.2031726 0.1175777 -0.1295535 -0.2119143 0.02473523 -0.3466471 -0.03287368 -12.61396 2.624438 0.5793518 -0.5736759 2.816352 1.200431 0.9962755 -0.6052314 0.8645247 -1.799471 -0.6197203 -1.747931 0.3431774 -0.145746 -0.1559872 0.3499773 0.5686199 -0.457497 0.7343623 1.79025 -1.090676 -0.1386361 0.05549732 0.07809679 
0.1175499 0.1594289 0.09707978 0.1127594 0.1027289 0.04967378 0.1279875 0.1189652 0.1554932 0.09551752 0.1056062 0.1226566 0.1192773 0.1608773 0.2993005 0.2443123 7.420984 -0.1715477 0.2344371 0.259454 0.2967178 0.3794438 0.4154487 0.3944435 0.3049981 0.2508245 0.2895325 0.3304023 0.3961924 0.3042122 0.2854415 0.2449406 0.3211921 0.4622654 0.5587252 0.4714258 -18.41513 -0.7532524 -1.205547 -0.7790083 -0.6162612 -0.1663454 0.2871161 0.2297778 0.0386747 -0.2993824 -1.104734 -1.783409 -1.564031 -0.5610726 0.05322615 0.354266 0.2031139 -0.6783442 -2.59767 -2.861601 
sid/train_full_ubm.sh --cmd "$train_cmd" data/dev exp/diag_ubm_1024 exp/full_ubm_1024

Now we use the dev data to train a full-covariance UBM (full GMM); this requires the diagonal UBM trained above.

Besides the usual parameter settings, data/dev must contain feats.scp and vad.scp,
and exp/diag_ubm_1024 must contain final.ubm or final.dubm.
The final output is exp/full_ubm_1024/final.ubm.

As for why we first train a diagonal UBM and only then a full one, there is apparently a paper with a detailed mathematical justification; see: (URL to be added)

You can inspect the generated ubm file with the following command:
/data/kaldi/src/fgmmbin/fgmm-global-copy --binary=false final.ubm final_ubm.txt
(run it from the directory that contains final.ubm)

Opening it in vi again shows gconsts, weights and means_invvars, just as above.
For what these parameters mean in a GMM, see:
http://notes.funcwj.cn/2017/05/28/kaldi-gmm/

sid/train_ivector_extractor.sh  --num-iters 5 exp/full_ubm_1024/final.ubm data/dev  exp/extractor_1024

The ivector_extractor is what the literature calls the T matrix.
This step needs the previously generated UBM and feats.scp. It produces exp/extractor_1024/final.ie, and this final.ie is the T matrix.

The exp/extractor_1024 folder also contains a 5.ie; final.ie appears to be a soft link to 5.ie (the model from the last of the 5 iterations), since the two files are exactly the same size. The output during this step was as follows:

So, can we take a look at what the T matrix actually contains? Below is the answer I found:

Apparently it cannot be viewed directly.

sid/extract_ivectors.sh exp/extractor_1024 data/dev exp/ivector_train_1024

The inputs to this step are mainly three files: final.ie, final.ubm and feats.scp.
The generated files are mainly ivector.scp, num_utts.ark and spk_ivector.scp,
plus log files.

ivector.scp and spk_ivector.scp are both index files: the former records where the i-vector extracted from each utterance in the dev set is stored, the latter where the i-vector of each speaker in the dev set is stored.

ivector.scp is obtained by concatenation: since nj=30 was set, it is merged from 30 per-job files.
The final ivector.scp comes out of ivector-normalize-length;
num_utts.ark comes out of ivector-mean;
spk_ivector.scp also comes out of ivector-normalize-length.
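If I read these tools correctly, ivector-normalize-length (by default) scales each i-vector so its 2-norm equals sqrt(dim), and ivector-mean averages one speaker's utterance i-vectors. A sketch of both, using toy 2-dimensional vectors rather than real i-vectors:

```python
import math

def normalize_length(vec):
    """Scale vec so its 2-norm becomes sqrt(dim): my understanding of what
    ivector-normalize-length does with its default --normalize=true."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x * math.sqrt(len(vec)) / norm for x in vec]

def speaker_mean(utt_ivectors):
    """Average one speaker's utterance i-vectors (what ivector-mean does)."""
    n = len(utt_ivectors)
    return [sum(col) / n for col in zip(*utt_ivectors)]

nl = normalize_length([3.0, 4.0])             # toy 2-dim "i-vector"
spk = speaker_mean([[1.0, 2.0], [3.0, 4.0]])  # speaker with 2 utterances
```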

vi ivector.scp shows that the first few lines look like this; there are 14329 lines in total:

BAC009S0724W0121 exp/ivector_train_1024/ivector.1.ark:17
BAC009S0724W0122 exp/ivector_train_1024/ivector.1.ark:4092
BAC009S0724W0123 exp/ivector_train_1024/ivector.1.ark:8170
BAC009S0724W0124 exp/ivector_train_1024/ivector.1.ark:12227
BAC009S0724W0125 exp/ivector_train_1024/ivector.1.ark:16311
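Each line has the form `utt-id path:offset`, where the number after the colon is the byte offset at which that utterance's record starts inside the .ark file. A minimal parser sketch:

```python
def parse_scp_line(line):
    """Split a Kaldi scp entry 'key path:offset' into its parts; the number
    after the colon is the byte offset of the record inside the .ark file."""
    key, rxfile = line.strip().split(None, 1)
    path, offset = rxfile.rsplit(":", 1)
    return key, path, int(offset)

key, path, offset = parse_scp_line(
    "BAC009S0724W0121 exp/ivector_train_1024/ivector.1.ark:17")
```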

vi spk_ivector.scp shows that the first few lines look like this; there are 40 lines in total, and our dev data indeed has exactly 40 speakers:

S0724 exp/ivector_train_1024/spk_ivector.ark:6
S0725 exp/ivector_train_1024/spk_ivector.ark:1622
S0726 exp/ivector_train_1024/spk_ivector.ark:3238
S0727 exp/ivector_train_1024/spk_ivector.ark:4854
S0728 exp/ivector_train_1024/spk_ivector.ark:6470
S0729 exp/ivector_train_1024/spk_ivector.ark:8086

I wanted to convert num_utts.ark and spk_ivector.ark into txt files to take a look, but both of the commands below failed:
/data/kaldi/src/featbin/copy-feats ark:num_utts.ark ark,t:num_utts.txt

Failed to read matrix from stream.  : Expected "[", got "354" File position at start is 6, currently 9

/data/kaldi/src/featbin/copy-feats ark:spk_ivector.ark ark,t:spk_ivector.txt

 Failed to read matrix from stream.  : Expected token FM, got FV File position at start is 8, currently 11

The two errors look much the same, but I did not know how to fix them.
Then someone pointed out:
MFCC features are matrices, so copy-feats works on them; spk_ivector and the like are vectors, for which you need copy-vector.

So it was simply the wrong tool.

The following command converts spk_ivector.ark successfully:
/data/kaldi/src/bin/copy-vector ark:spk_ivector.ark ark,t:spk_ivector.txt

vi spk_ivector.txt shows 40 lines, each one a speaker's i-vector:

S0724  [ -1.494889 4.191723 -2.065231 6.424363 -0.568639 1.312023 3.608217 -4.009121 0.1447516 -2.923756 -0.5574226 1.440925 -3.323975 -2.562852 3.096631 -0.2247497 0.4704453 1.913887 -1.113701 -1.201724 0.6903941 1.167155 -0.8199453 3.009851 -2.01485 3.375238 4.916361 -1.110913 2.647804 -0.6668168 -2.410867 -4.51026 -7.304629 2.184405 3.194905 -1.498284 0.553113 3.858151 -1.511438 -4.013847 -2.028534 1.272021 -0.1734654 0.9513246 3.398404 0.4707038 -2.392773 -2.98877 0.9945849 -0.3856373 -1.920934 -1.049368 -1.160322 0.6684188 0.5707002 0.1175598 0.9667767 -1.02162 0.653752 -0.2854674 0.2077845 0.1745595 -0.7070389 -0.1560595 0.1427419 -0.4050522 -0.1530222 -0.1362765 0.06851027 0.1643819 0.2111228 0.1814216 -0.6835693 -1.014317 0.4377705 1.108795 0.03665004 -0.6145264 0.05215325 0.5097237 -0.4897903 -0.08710022 -1.574562 0.6131901 0.5488392 -0.1153821 0.05639264 0.4592863 0.2603703 -0.4789504 -0.414952 -0.6252485 0.1550305 -0.4125877 0.01622202 -0.2704884 0.5788739 -0.2037064 -0.5070553 -0.5928639 0.04789318 -0.4258216 0.228256 0.5971364 -0.06404956 -0.03200082 0.1308726 0.1356846 -0.2531437 0.1886812 0.1081557 -0.4833564 -0.678308 -0.04845379 0.09927486 -0.9495237 -0.2245603 -0.3165013 0.02694877 0.2020769 0.4898948 -0.3703753 0.2678764 -0.1782853 -0.007617008 0.3214126 -0.1696634 -0.01588814 0.6087376 0.1138676 0.06505909 -0.01939752 -0.1003009 -0.07192703 -0.2716809 0.2042752 0.03706297 0.006570957 0.04687546 -0.1889757 0.04962946 -0.4122401 4.747756e-05 0.1157316 0.3687924 -0.4359643 0.2279454 -0.2779478 -0.1074038 0.008337801 0.1925775 0.3197043 0.1443411 -0.1369099 0.8319659 -0.08869965 -0.07899147 -0.4317599 -0.2138949 0.3487202 0.07691551 -0.2397741 -0.3202068 -0.3259673 0.2402112 -0.1903881 -0.250883 0.2644596 0.3829333 0.3844469 0.0740787 0.0932843 0.3846858 -0.3141307 0.5252094 -0.1991133 0.2100554 -0.02659893 -0.3463546 0.2528807 0.2810422 -0.2118971 0.005206075 -0.08887724 -0.1757096 -0.1740363 0.1263264 -0.1046663 -0.1455405 0.1333608 -0.2458466 
0.05562488 0.1959593 -0.1207889 0.103675 0.01059854 0.4145905 -0.1224965 0.2875637 0.1132103 0.4379716 -0.3254119 0.2134016 -0.1532089 -0.1043074 -0.04225086 -0.09411705 -0.3552141 0.1115215 -0.09294089 -0.01204276 -0.1291037 -0.08631192 0.09826779 -0.1650585 -0.2444597 0.185434 0.0730179 0.03874771 0.6316848 -0.1259742 -0.01365566 0.09506631 0.160343 -0.1040568 -0.06245096 0.0087858 -0.1986545 -0.02488568 -0.2329771 0.2102711 0.1532938 -0.08635561 -0.170257 -0.2415882 0.1320202 0.06303477 0.2031551 0.333845 -0.2798648 -0.1150485 0.1489522 -0.1097059 0.2288545 0.07462427 0.2448708 -0.08298375 0.07950476 -0.07130237 -0.2022092 -0.03802164 0.03048396 0.251083 0.2141706 -0.02539639 0.0585734 0.2051167 0.3220056 -0.1617552 -0.2211852 0.4313145 -0.02947356 0.04032288 -0.06955115 0.2292648 -0.04206774 -0.4483271 -0.03293414 0.009052359 0.02929015 -0.4798698 0.02724461 -0.1812279 -0.01284112 -0.2944111 0.05225592 -0.06088836 -0.2536156 0.2747388 0.2773945 0.3572772 0.1332619 0.0728696 -0.002162603 -0.2960485 0.3648685 -0.1643449 0.1382677 0.1315332 -0.2055638 0.07308404 0.0775911 0.05101816 -0.1609955 -0.03998369 0.3833353 0.0762384 0.2259601 -0.3533331 0.1723336 -0.1273794 -0.0579498 -0.04028374 -0.2807759 -0.02513879 0.2404649 0.1638513 -0.2971145 0.01142092 -0.1097506 -0.3540099 -0.2323555 -0.151731 0.02238801 -0.1644489 0.2464665 0.01209402 0.07488076 -0.008290177 -0.1148907 -0.1289323 0.2015132 0.1364088 -0.001538895 -0.1333063 0.1076715 0.1780327 -0.1260636 0.3358147 -0.1411725 0.1081913 0.2306425 0.03675358 -0.2819468 0.1171324 0.00843475 0.1094012 -0.1817945 0.09073194 0.2270119 -0.03117535 -0.1676576 -0.2841377 -0.2444807 -0.1583724 0.09714724 0.1997346 -0.1385091 0.02599241 0.2083211 -0.09906653 0.1285685 0.2275843 0.272047 0.08569031 -0.02492276 -0.09767421 0.07894547 -0.1982982 -0.1328524 -0.01622497 -0.03954167 0.06320029 -0.01101624 -0.222086 0.1233027 -0.09254745 -0.2186535 -0.1013665 -0.0359985 0.1449545 0.02803968 -0.02912375 0.1636554 -0.04126236 
0.1816033 -0.02329022 0.1719275 0.05906342 0.004998843 -0.08574011 0.1626289 -0.132733 0.2194036 0.2026509 -0.1516984 0.3285471 -0.01439776 0.1683345 -0.1854422 0.02448971 -0.06717207 0.1345156 -0.04313087 -0.3431467 -0.1976715 -0.1164039 0.3548833 -0.05123337 0.205405 ]
S0725  [ 0.8041796 5.94068 -0.8887277 6.294258 -2.406076 -2.882549 3.511944 -0.2957812 -2.881301 -2.190311 -0.08636046 -2.519117 0.8694347 2.786109 0.1098586 0.8748216 0.1324227 0.6397336 0.9138678 -3.644598 -2.416142 3.535862 1.579952 1.184159 -1.537188 1.407604 2.370275 -5.242565 -2.816892 -2.607595 0.4930034 2.480039 1.271463 -1.900119 2.85736 0.7579505 0.5555257 -5.944658 -3.871988 -0.3516358 -4.378802 0.9084898 -2.910641 -1.331581 -2.092254 -3.686688 1.488191 5.053768 0.6904211 1.002361 -0.6862143 0.5013492 -1.48068 -0.7957741 0.3480401 0.2529564 -0.6769273 0.5072885 0.6042187 0.03246704 -1.616165 -0.003808041 0.6318277 -0.07449961 -0.6052511 -0.1681036 1.178499 0.5140572 0.07924429 0.09277374 -0.04609898 -0.473026 0.821262 0.7670045 0.8529429 -0.08135009 0.2607883 0.5047837 -0.6034142 0.4513271 0.0432246 0.4231757 0.4001434 0.5678196 0.5255061 -0.03991098 -0.1794385 -0.6687717 0.4597293 -0.2335239 -0.4026968 -0.2005372 -0.2745363 -0.2405087 0.1719116 0.2940277 -0.2869862 -0.1807654 -0.06652585 0.5426919 0.1690021 -0.3698992 -0.257915 -0.4971139 0.05586409 -0.005944984 0.1588144 0.05333629 0.3704939 -0.6044451 -0.04667811 -0.5552963 0.4823722 0.1958621 -0.07260018 0.3352361 -0.5052102 -0.02079422 0.4317544 -0.1032136 -0.5556011 0.09207463 0.2045453 0.1224866 -0.1346444 0.3177164 0.130081 -0.1433135 0.03923687 -0.2099407 0.1972057 -0.2211918 -0.289142 0.1196337 0.09689581 -0.1696378 -0.2859829 -0.7996486 0.0428399 0.3895903 -0.108304 0.01645086 0.5325287 0.1354358 -0.2319508 0.2415285 -0.1153102 0.1938278 -0.325315 -0.116042 -0.3992225 0.2280821 -0.5746618 0.06443445 -0.2182213 0.3069575 0.4022468 0.2219922 0.4018189 -0.02602194 0.2421206 0.1445634 0.09558339 0.007263102 -0.344185 -0.6733993 0.0003353585 0.2169537 0.03506657 -0.5429401 0.1469321 0.1200037 -0.2797755 0.1757131 -0.3604022 0.3293722 -0.3214694 0.01938171 0.4461701 0.02956007 -0.4175106 0.002300848 0.527434 0.283853 0.4642438 0.2481068 0.231111 -0.002678571 0.08016433 0.1523745 -0.02990462 
-0.07880116 -0.103456 -0.1423509 0.2529326 0.2251648 -0.2896209 -0.0295428 -0.1072371 -0.2268201 -0.3058139 0.1187028 -0.1205553 -0.1229838 -0.07447968 0.1943579 -0.1342085 -0.01923033 0.02637444 -0.493792 0.0433018 0.397303 0.1019525 0.005464092 0.07126387 -0.1023972 0.05623214 0.02288585 0.06535462 -0.0354303 -0.4313107 -0.3371484 -0.09296048 -0.06355966 -0.03899284 0.1582913 0.09922784 0.2149033 0.139515 0.1481672 0.2223821 0.3337491 -0.116579 -0.07792707 0.2245804 -0.1954061 -0.03152033 0.006412492 -0.2759017 -0.09991004 0.04058399 -0.2376727 -0.111545 -0.06499203 -0.1128237 0.04170181 0.07870664 0.2807561 -0.2407328 0.06216154 0.1232609 0.03881149 -0.1409105 -0.6106396 0.2001712 0.05155772 -0.2086526 0.1649832 -0.1136075 -0.0101153 -0.07759444 0.2446389 -0.3371444 0.1467948 -0.1506406 0.1093872 0.2175004 0.0786595 0.296654 0.09387305 0.277629 -0.1875827 0.4124308 0.0344426 0.1127761 -0.06170657 0.3204251 -0.05337205 0.1880175 0.3471883 -0.4033144 -0.1266061 0.1426925 -0.02332265 -0.1607591 -0.1535366 0.2280417 0.04479303 0.02889769 -0.1088656 0.1178713 -0.05752108 -0.4668415 0.001221162 -0.1810047 0.05586271 -0.08012377 0.02761767 0.2970267 -0.09274887 0.2588959 0.08028596 0.1634502 -0.02121091 -0.1524414 -0.2171274 -0.08831488 -0.06643203 0.1489908 0.1274579 0.1265518 0.03492879 -0.2730893 -0.1239263 0.1891675 -0.3137865 0.2507259 -0.1369135 0.1598279 0.2403593 0.2587933 0.1178048 0.0457137 -0.307166 -0.1206572 0.05193166 -0.2492081 0.3415496 0.1707307 -0.2951979 -0.1114113 -0.218349 0.050657 -0.1909794 -0.1417466 0.1768075 -0.1277354 -0.2174968 -0.06037929 -0.3430539 0.002913235 0.0684198 0.1796879 0.01486284 0.08621324 -0.008714312 -0.1153829 0.3616968 -0.2983994 -0.4504571 -0.1443657 -0.3296406 0.2083447 -0.08574943 -0.1386682 0.1122162 0.07711552 -0.2683686 -0.1106308 0.02889133 -0.1343029 -0.07115016 0.1620745 -0.01212995 -0.01165778 0.1374442 0.07343705 -0.04005316 -0.003502321 0.01970092 -0.1294793 -0.08542334 0.05443333 0.1664351 0.1160903 -0.1575787 
-0.06139877 -0.1823147 -0.103742 -0.04204597 -0.1111197 -0.0933818 -0.3988104 -0.2606748 -0.01516879 0.1474117 -0.02395137 0.2461554 -0.1507015 -0.1248211 0.001748668 -0.2032605 -0.06010894 0.121857 0.2702911 0.04425021 0.06294047 -0.2531649 0.03569878 0.1902132 ]
S0726  [ -1.340571 4.691498 -2.993285 1.920235 4.008678 3.785384 -6.476716 1.654098 -1.074844 5.548283 -0.5517797 -1.565583 1.563615 -0.02729667 4.343427 2.126348 -0.7766336 0.2725935 -1.031017 2.987354 -0.3090956 -1.312063 -4.198803 -2.141245 -3.696198 0.1116055 1.017338 -2.562538 3.494059 -2.867281 -0.5113228 -2.200865 -1.312422 2.250089 -2.675797 2.540892 1.835521 -2.393276 4.312166 -0.2578609 1.963957 -3.104418 -4.946022 -2.559365 0.3120093 -2.87882 2.213581 0.7689951 -0.2024663 1.003667 0.4563357 2.278036 -0.2433449 -0.4259536 -0.8010104 -1.596963 1.230152 -0.6071995 -1.24744 1.181948 1.087675 -0.01530222 -0.828462 -0.2775151 1.776297 -0.104835 -0.09839502 0.2182278 -0.4824146 -0.9133016 0.2472073 0.02435649 0.2804052 -0.8097093 -0.1962302 0.5421208 0.7430857 0.1610352 0.8629059 0.08666219 0.3733602 0.1901667 -0.2630891 0.08525112 1.067609 -0.07115719 -0.5636137 0.6031322 -0.5179823 -0.115532 0.2603432 0.5440969 0.005699251 -0.04578539 -0.1939033 0.3821617 -0.2515148 0.1686436 0.182062 0.2545769 -0.6174753 0.1867031 0.02933959 0.4289897 0.2867047 -0.04903397 0.1971467 -0.1124311 -0.5184212 -0.09997346 -0.152709 0.02524968 -0.4159936 -0.1691817 0.195645 0.210189 -0.0702686 0.1089707 0.3233504 -0.08706411 -0.1153426 0.2903626 0.1779463 -0.08237841 -0.310904 -0.1256424 0.7416076 0.06467074 -0.8405453 -0.4117872 0.09906352 -0.00331719 0.1452024 -0.08190262 -0.1068315 -0.1353912 0.02532914 0.3162935 0.4996234 0.08206555 -0.08366828 0.01807739 -0.05346161 -0.1660512 -0.01636232 0.1951329 0.02820286 0.1838468 0.1029802 0.01625567 -0.1224076 -0.08937152 0.4477914 -0.1005422 0.1601439 -0.08400351 -0.3638651 -0.01489335 0.1085772 0.285939 0.07375979 0.06779338 -0.05135624 -0.3099602 -0.02534228 0.3333646 -0.1332639 -0.2098357 -0.2710572 -0.2629967 -0.1089047 0.1083974 0.06158936 -0.2756806 -0.05032137 0.06327216 -0.04338876 0.06196227 0.108547 -0.3900363 -0.1897604 -0.217908 0.04078824 -0.1405389 0.1902128 0.03140945 -0.05359526 -0.4618537 0.2392075 0.07328679 
-0.1651965 0.1214406 -0.3249526 0.005201185 -0.1688347 0.6247766 -0.004709977 0.2219362 0.05677909 0.09456781 -0.1208571 -0.091753 0.01673936 -0.09552711 -0.07130214 -0.4672336 -0.2194615 0.4854264 -0.09539215 0.354326 0.1747648 -0.3306045 -0.1513683 -0.0997763 -0.3555816 0.08428674 0.3245367 -0.2413439 0.2424032 0.2561027 0.06659213 -0.06164191 -0.3369367 0.07145388 -0.2504554 0.1533228 0.07741093 0.06212189 -0.09227102 -0.1335045 -0.1398007 -0.04935808 0.1178666 0.09204834 -0.06958939 0.07082949 -0.001988559 -0.3136062 -0.3910876 -0.3533334 0.1111924 0.02602771 0.1195863 -0.05434574 0.06413475 -0.3141328 0.1001284 -0.2232476 0.398571 0.1928085 -0.02919568 -0.1025953 0.09743194 0.4436168 0.1102299 -0.2582489 0.185295 -0.2110615 0.07914416 0.004930434 -0.3569424 0.22357 -0.07820505 0.1403013 -0.09690109 -0.2313591 0.02116148 0.04271566 0.04106966 0.2722642 -0.1220954 0.01026134 0.02061396 -0.1472459 0.163418 -0.1449603 0.1068102 -0.3402374 -0.2024111 0.1000245 0.09743467 -0.02891107 -0.1421191 0.1008919 0.1994208 -0.4757963 0.01956815 0.129471 -0.07649661 -0.2033038 -0.1620552 0.01348152 -0.08549049 -0.007245569 0.2410901 0.1245441 -0.1624105 0.1819825 -0.1169036 -0.5291017 0.5396571 -0.002323939 0.07588347 -0.168654 -0.2573726 -0.07254156 -0.3947659 -0.06828147 0.2263659 0.09561229 -0.01351237 0.02564162 -0.08341207 0.3699638 -0.1186806 -0.01459215 0.1815215 0.09441543 -0.2547249 -0.008088351 0.08027849 0.04729156 0.1431286 0.05851899 -0.04010045 0.3495248 -0.05698694 0.1876737 -0.009963759 0.04271591 -0.05629878 0.03008218 0.08714964 0.1841383 0.4573995 -0.1442645 0.1620999 -0.3055896 -0.1940916 -0.1181043 -0.2900674 -0.1641857 -0.06665977 -0.08350671 -0.1783914 0.1549092 0.169354 0.2380639 -0.2681069 -0.2300841 0.02572402 -0.2237318 -0.0440267 -0.1852065 0.344095 0.01668577 0.2331298 0.171599 0.2066052 -0.06064308 0.4551559 0.07329386 0.2318103 0.1112986 -0.05322633 0.2539975 0.003321742 0.1328607 0.1397749 -0.1392178 -0.08895784 -0.02799051 -0.03951854 
0.02249463 0.03079843 0.02243672 0.08000381 0.002154835 0.176899 0.1237146 0.0139229 0.1453949 0.1870653 0.2821339 0.02763713 -0.2077475 0.2478437 -0.1937061 0.02949828 0.2551794 -0.1168287 0.02534366 0.02826694 -0.0979787 -0.1647054 0.05608033 -0.3463041 0.2184189 0.08276559 0.04314472 ]

Every speaker's i-vector has the same length, 400 here,
although in the literature the final i-vector length is usually 600.

The following command converts num_utts.ark successfully:
/data/kaldi/src/bin/copy-int-vector ark:num_utts.ark ark,t:num_utts.txt

Opening the txt file, there are also 40 lines; the first few look like this:

S0724 354
S0725 357
S0726 355
S0727 362
S0728 360
S0729 358

num_utts.ark is actually derived from dev/spk2utt; presumably it just counts how many utterances each speaker has in total. I verified that the dev/S0724 folder indeed contains 354 wav files.

First the i-vector of each utterance is extracted; then each speaker's i-vector is obtained by averaging over that speaker's utterances (hence the per-speaker utterance counts).
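That aggregation step can be sketched as a plain average (a minimal numpy sketch; whether Kaldi additionally weights or re-normalizes the mean is not shown here):

```python
import numpy as np

def speaker_ivector(utt_ivectors):
    """Average a speaker's per-utterance i-vectors into one speaker-level i-vector."""
    return np.mean(np.stack(utt_ivectors), axis=0)

# Toy speaker with 3 utterances of 4-dimensional "i-vectors".
utts = [np.array([1.0, 2.0, 0.0, 4.0]),
        np.array([3.0, 0.0, 0.0, 2.0]),
        np.array([2.0, 1.0, 3.0, 0.0])]
print(speaker_ivector(utts))  # -> [2. 1. 1. 2.]
```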

#train plda
$train_cmd exp/ivector_train_1024/log/plda.log \
  ivector-compute-plda ark:data/dev/spk2utt \
  'ark:ivector-normalize-length scp:exp/ivector_train_1024/ivector.scp  ark:- |' \
  exp/ivector_train_1024/plda

But the ivector.scp inside the single quotes already went through ivector-normalize-length in the previous step, so why run it again?

This step trains the PLDA model. The main inputs are dev/spk2utt and ivector.scp; the output file is exp/ivector_train_1024/plda, which should be the PLDA model itself.
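One observation on the question above: length normalization rescales an i-vector to a fixed norm, so applying it twice is harmless. A minimal sketch, assuming the target norm is sqrt(dim) (my reading of the tool's behavior, not a quote of the Kaldi source):

```python
import numpy as np

def normalize_length(ivec):
    """Rescale an i-vector so its L2 norm equals sqrt(dim)
    (my assumption about what ivector-normalize-length does)."""
    return ivec * (np.sqrt(ivec.size) / np.linalg.norm(ivec))

v = np.array([3.0, 4.0])          # norm 5, dim 2
once = normalize_length(v)        # norm becomes sqrt(2)
twice = normalize_length(once)    # second pass changes nothing
print(np.allclose(once, twice))   # -> True
```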

The blog posts below seem to explain the detailed logic of this step, but I still don't really understand it; I only know the basic PLDA formula and cannot tell which quantity corresponds to which:
https://blog.csdn.net/liusongxiang666/article/details/83024845
https://blog.csdn.net/zjm750617105/article/details/52832295
https://blog.csdn.net/weixin_38858860/article/details/83960972

The following command converts the generated PLDA model into a txt file:
/data/kaldi/src/ivectorbin/ivector-copy-plda --binary=false plda plda.txt

According to this article: here
plda.txt has three parts: a mean vector; a transform matrix, i.e. the F in x_ij = u + F h_i + epsilon (this is the simplified version of PLDA);
and presumably an estimate of epsilon as well.
(epsilon is by default a Gaussian random variable; is its estimate just a mean?)

Below is the answer I found on Google:

There should actually be four parts. Below is an excerpt from plda.h (source code here): in order, a mean vector, a matrix, a vector of between-class variances, and an offset vector. But then how is this still the simplified version of PLDA?

Vector<double> mean_;       // mean of samples in original space.
Matrix<double> transform_;  // of dimension Dim() by Dim();
                            // this transform makes within-class covar unit
                            // and diagonalizes the between-class covar.
Vector<double> psi_;        // of dimension Dim().  The between-class
                            // (diagonal) covariance elements, in decreasing order.
Vector<double> offset_;     // derived variable: -1.0 * transform_ * mean_
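The relation between the last two members can be checked numerically; a minimal sketch with made-up 2-dimensional numbers:

```python
import numpy as np

# Made-up 2-dimensional PLDA members, only to check the relation
# offset_ = -1.0 * transform_ * mean_ quoted from plda.h.
mean = np.array([1.0, -2.0])
transform = np.array([[2.0, 0.5],
                      [0.0, 1.0]])
offset = -1.0 * transform @ mean
print(offset)  # -> [-1.  2.]
```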

The 2006 PLDA paper is here: "Probabilistic Linear Discriminant Analysis" by Sergey Ioffe, ECCV 2006
The 2007 paper: Probabilistic Linear Discriminant Analysis for Inferences About Identity
The 2011 paper: Analysis of I-vector Length Normalization in Speaker

Reference: https://blog.csdn.net/qq_36962569/article/details/88427221


In the scoring stage, each enrollment utterance presumably yields its x_ij (x_ij should just be the i-vector, right?), and the PLDA model then produces its h_i; the same is done for the test utterance. The likelihoods of the two h_i are then computed and subtracted.

#split the test to enroll and eval
mkdir -p data/test/enroll data/test/eval
cp data/test/{spk2utt,feats.scp,vad.scp} data/test/enroll
cp data/test/{spk2utt,feats.scp,vad.scp} data/test/eval
local/split_data_enroll_eval.py data/test/utt2spk  data/test/enroll/utt2spk  data/test/eval/utt2spk
trials=data/test/aishell_speaker_ver.lst
local/produce_trials.py data/test/eval/utt2spk $trials
utils/fix_data_dir.sh data/test/enroll
utils/fix_data_dir.sh data/test/eval

This step mainly assembles files and generates the ground truth.

For split_data_enroll_eval.py, the internal comment is shown below; in short, only 3 utterances per speaker are used for enrollment, and all the others are used for evaluation.

# This script splits the test set utt2spk into enroll set and eval set
# For each speaker, 3 utterances are randomly selected as enroll samples,
# and the others are used as eval samples for evaluation
# input: test utt2spk
# output: enroll utt2spk, eval utt2spk
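The split described by the comment above can be sketched as follows (a hypothetical re-implementation for illustration, not the actual local/split_data_enroll_eval.py):

```python
import random
from collections import defaultdict

def split_utt2spk(lines, n_enroll=3, seed=0):
    """Per speaker, pick n_enroll random utterances for enrollment;
    the rest go to eval."""
    by_spk = defaultdict(list)
    for line in lines:
        utt, spk = line.split()
        by_spk[spk].append(utt)
    rng = random.Random(seed)
    enroll, eval_ = [], []
    for spk, utts in sorted(by_spk.items()):
        chosen = set(rng.sample(utts, n_enroll))
        for utt in utts:
            (enroll if utt in chosen else eval_).append(f"{utt} {spk}")
    return enroll, eval_

# Toy speaker with 5 utterances: 3 go to enroll, 2 to eval.
lines = [f"BAC009S0764W{i:04d} S0764" for i in range(1, 6)]
enroll, eval_ = split_utt2spk(lines)
print(len(enroll), len(eval_))  # -> 3 2
```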

For produce_trials.py, the internal comment is shown below.

# This script generate trials file.
# Trial file is formatted as:
# uttid spkid target|nontarget
# If uttid belong to spkid, it is marked 'target',
# otherwise is 'nontarget'.
# input: eval set uttspk file
# output: trial file

The generated file is aishell_speaker_ver.lst. Opening it, there are 14232 lines in total; the first few look like this:

BAC009S0764W0166 S0764 target
BAC009S0764W0166 S0765 nontarget
BAC009S0764W0166 S0766 nontarget
BAC009S0764W0166 S0767 nontarget

This is the ground truth, for example:
the audio BAC009S0764W0166.wav does come from speaker S0764;
the audio BAC009S0764W0166.wav does not come from speaker S0765.
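Generating those trial lines can be sketched like this (a hypothetical re-implementation of the comment above, not the actual local/produce_trials.py):

```python
def produce_trials(eval_utt2spk, speakers):
    """For every eval utterance, emit one line per enrollment speaker,
    labelled 'target' if the utterance belongs to that speaker and
    'nontarget' otherwise."""
    trials = []
    for line in eval_utt2spk:
        utt, true_spk = line.split()
        for spk in speakers:
            label = "target" if spk == true_spk else "nontarget"
            trials.append(f"{utt} {spk} {label}")
    return trials

trials = produce_trials(["BAC009S0764W0166 S0764"], ["S0764", "S0765", "S0766"])
print(trials[0])  # -> BAC009S0764W0166 S0764 target
print(trials[1])  # -> BAC009S0764W0166 S0765 nontarget
```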

The final two checks report "kept all": nothing was changed.

7.

#extract enroll ivector
sid/extract_ivectors.sh --cmd "$train_cmd" --nj 10 \
  exp/extractor_1024 data/test/enroll exp/ivector_enroll_1024
#extract eval ivector
sid/extract_ivectors.sh --cmd "$train_cmd" --nj 10 \
  exp/extractor_1024 data/test/eval exp/ivector_eval_1024

This step extracts i-vectors for the enrollment utterances and the evaluation utterances separately.

It is the same as step 4 above in terms of inputs and outputs; refer to step 4 directly.

The main inputs of this step are three files: final.ie, final.ubm, and feats.scp.
The main outputs are ivector.scp, num_utts.ark, and spk_ivector.scp,
plus the log files. Both ivector.scp and spk_ivector.scp are path files: the former records where each utterance's i-vector is stored, the latter where each speaker's i-vector is stored.

The i-vector extraction formula is M = m + T*w.
w is what we want: we estimate it (w is a Gaussian random variable; modeling it yields the mean and covariance of its distribution) and take the mean as the utterance's i-vector.

final.ie is the T matrix, obtained with the EM algorithm. It maps the high-dimensional GMM supervector to the low-dimensional space represented by the i-vector: a mapping, a transform.
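Estimating w can be sketched via the standard i-vector posterior, here drastically simplified to a single Gaussian component (my own sketch under that assumption, not the Kaldi implementation): with soft frame count n, centered first-order statistic f and diagonal covariance Sigma, the posterior precision is L = I + n * T'Sigma^-1 T and the i-vector is w = L^-1 T'Sigma^-1 f.

```python
import numpy as np

def ivector_posterior_mean(T, sigma_diag, n, f):
    """Posterior mean of w in M = m + T*w for one Gaussian component.

    T          : (feat_dim, iv_dim) total-variability matrix
    sigma_diag : (feat_dim,) diagonal covariance of the component
    n          : zero-order statistic (soft frame count)
    f          : (feat_dim,) centered first-order statistic, sum_t gamma_t*(x_t - m)
    """
    TS = T.T / sigma_diag                  # T^T Sigma^{-1}
    L = np.eye(T.shape[1]) + n * (TS @ T)  # posterior precision of w
    return np.linalg.solve(L, TS @ f)

T = np.array([[1.0], [0.0]])               # 2-dim features, 1-dim "i-vector"
w = ivector_posterior_mean(T, np.ones(2), n=4.0, f=np.array([2.0, 5.0]))
print(w)  # -> [0.4]
```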

m is stored in final.ubm; it is the mean supervector of the UBM (the GMM), independent of speaker and channel.

M is obtained from feats.scp and m by repeatedly adapting m under the maximum a posteriori criterion; it represents one utterance.
Both M and the i-vector represent the utterance, but M contains all kinds of information (speaker, channel, and so on), while the i-vector supposedly contains only the speaker information?
Reference: https://blog.csdn.net/qq_36962569/article/details/88427221

For each utterance, MAP adaptation adapts the UBM to the current speech, updating only the means;
concatenating the mean vectors of the resulting GMM components then forms that utterance's Gaussian mean supervector M.
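The mean-only MAP update is usually written with a relevance factor r as m_new = (sum_t gamma_t x_t + r*m) / (sum_t gamma_t + r); a minimal sketch (r and the toy numbers are assumptions):

```python
import numpy as np

def map_adapt_mean(m, gammas, frames, r=16.0):
    """Relevance-MAP update of one UBM component mean (means only).

    gammas : (T,) posterior occupancy of this component per frame
    frames : (T, dim) feature frames
    r      : relevance factor
    """
    n = gammas.sum()          # soft frame count for this component
    fx = gammas @ frames      # first-order statistic
    return (fx + r * m) / (n + r)

m = np.zeros(2)
frames = np.array([[1.0, 2.0], [3.0, 2.0]])
gammas = np.array([1.0, 1.0])
print(map_adapt_mean(m, gammas, frames, r=2.0))  # -> [1. 1.]
```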

Presumably one first computes frame-level quantities, aggregates them into an utterance-level i-vector, and then into a speaker-level (spk level) i-vector. As for how to aggregate: just the mean? Concatenating mean and standard deviation? An attention mechanism, or some other neural-network approach? The "segment level" mentioned in some papers should mostly mean the utterance level, I think.


8.

#compute plda score
$train_cmd exp/ivector_eval_1024/log/plda_score.log \
  ivector-plda-scoring --num-utts=ark:exp/ivector_enroll_1024/num_utts.ark \
  exp/ivector_train_1024/plda \
  ark:exp/ivector_enroll_1024/spk_ivector.ark \
  "ark:ivector-normalize-length scp:exp/ivector_eval_1024/ivector.scp ark:- |" \
  "cat '$trials' | awk '{print \\\$2, \\\$1}' |" exp/trials_out

Converting exp/ivector_enroll_1024/num_utts.ark to txt and opening it shows only 20 lines: the test set has only 20 speakers, and 3 utterances per speaker were randomly chosen as enrollment speech.
After extracting each utterance's i-vector, averaging produces each enrollment speaker's i-vector.

S0764 3
S0765 3
S0766 3
S0767 3
S0768 3
S0769 3
S0770 3
S0901 3
S0902 3
S0903 3
S0904 3
S0905 3
S0906 3
S0907 3
S0908 3
S0912 3
S0913 3
S0914 3
S0915 3
S0916 3

Opening trials_out, there are 14232 lines in total; the first few look like this:

S0764 BAC009S0764W0166 13.97638
S0765 BAC009S0764W0166 -32.16938
S0766 BAC009S0764W0166 -39.67115
S0767 BAC009S0764W0166 -51.88986
S0768 BAC009S0764W0166 -82.01186

These are the scoring results. For example:
the PLDA score between the i-vector representing speaker S0764 and the i-vector representing the utterance BAC009S0764W0166.wav is 13.97638.
Does that alone prove the utterance comes from this speaker? No; below we still have to sweep thresholds to find the EER.

Should each speaker get an individual threshold, or should one threshold be used for everyone? This is where score normalization comes in; usually a single threshold is used for all speakers.
Also, these scores come from one method (i-vector); if another method (x-vector) produces its own scores, how do we compare which is better? That requires score calibration.

Next, the ground truth and the scores have to be combined, i.e. aishell_speaker_ver.lst merged with trials_out; see "Learning Kaldi from the speaker-recognition demo (7): computing the EER".

How exactly does ivector-plda-scoring compute the score? (source code)
It computes two likelihoods and subtracts them; plenty of write-ups about this can be found online.
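In the space where transform_ has been applied (within-class covariance = I, between-class covariance = diag(psi_)), that likelihood-ratio can be sketched like this (a simplified two-covariance scoring sketch, not the actual ivector-plda-scoring code):

```python
import numpy as np

def plda_llr(u_enroll, u_test, psi, n=1):
    """Log-likelihood ratio 'same speaker' vs 'different speaker'.

    u_enroll : mean of n enrollment i-vectors, already transformed/centered
    u_test   : test i-vector in the same space
    psi      : diagonal between-class variances (psi_ in plda.h)
    """
    def log_gauss(x, mean, var):
        return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)
    # Same speaker: test vector drawn around the enrollment posterior.
    mean_same = n * psi / (n * psi + 1.0) * u_enroll
    var_same = 1.0 + psi / (n * psi + 1.0)
    # Different speaker: speaker factor drawn fresh from the prior.
    return log_gauss(u_test, mean_same, var_same) - log_gauss(u_test, 0.0, 1.0 + psi)

psi = np.array([2.0, 2.0])
u = np.array([1.0, -1.0])
# A matching pair should score higher than a mismatched pair.
print(plda_llr(u, u, psi) > plda_llr(u, -u, psi))  # -> True
```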

#compute eer
awk '{print $3}' exp/trials_out | paste - $trials | awk '{print $1, $4}' | compute-eer -
# Result
# Scoring against data/test/aishell_speaker_ver.lst
# Equal error rate is 0.140528%, at threshold -12.018

My results: EER = 1.447, at threshold -13.1687.
The reference results: EER = 1.405, at threshold -12.018.
The reference run used the train set, while I used the dev set.

For how the EER is computed, see the links below, plus part (7):
https://blog.csdn.net/zjm750617105/article/details/52558779
https://blog.csdn.net/zjm750617105/article/details/60503253
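The EER search those posts describe can be sketched as a naive threshold sweep (an illustrative sketch, not how the compute-eer binary is implemented):

```python
import numpy as np

def compute_eer(scores, labels):
    """Sweep every score as a threshold and return (EER, threshold)
    at the point where false-alarm rate and miss rate are closest."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    best_gap, eer, eer_thr = np.inf, None, None
    for thr in np.sort(scores):
        fa = np.mean(scores[~labels] >= thr)   # nontargets accepted
        miss = np.mean(scores[labels] < thr)   # targets rejected
        if abs(fa - miss) < best_gap:
            best_gap, eer, eer_thr = abs(fa - miss), (fa + miss) / 2, thr
    return eer, eer_thr

# Toy scores: targets clearly above nontargets, so the EER is 0 here.
scores = [13.9, -32.1, -39.6, 12.5, -51.8, -20.0]
labels = [True, False, False, True, False, True]
eer, thr = compute_eer(scores, labels)
print(eer)  # -> 0.0
```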

On how to use awk:
https://blog.csdn.net/mosesmo1989/article/details/51093485

Learning Kaldi from the speaker-recognition demo (6): training the UBM and PLDA