The file reuters_mlp.py is a text-classification example.

The dataset is Reuters newswire articles. We can decode its contents back into text to take a look:

from keras.datasets import reuters

max_words = 1000  # vocabulary size, as in the stock example

# Load the data the same way reuters_mlp.py does, so the indices below line up.
(x_train, y_train), (x_test, y_test) = reuters.load_data(num_words=max_words, test_split=0.2)

word_index = reuters.get_word_index()
word_index = {k: (v + 3) for k, v in word_index.items()}  # shift past the reserved indices
word_index["<PAD>"] = 0
word_index["<START>"] = 1
word_index["?"] = 2  # unknown
word_index["<UNUSED>"] = 3

reverse_word_index = dict([(value, key) for (key, value) in word_index.items()])

def decode_review(text):
    return ' '.join([reverse_word_index.get(i, '?') for i in text])

for i in range(10):
    print(decode_review(x_train[i]))
    print(y_train[i])

This prints news articles like the following:

<START> ? ? said as a result of its december acquisition of ? co it expects earnings per share in 1987 of 1 15 to 1 30 dlrs per share up from 70 cts in 1986 the company said pretax net should rise to nine to 10 mln dlrs from six mln dlrs in 1986 and ? ? revenues to 19 to 22 mln dlrs from 12 5 mln dlrs it said cash ? per share this year should be 2 50 to three dlrs reuter 3

The +3 shift above is needed because indices 0, 1, 2 and 3 are reserved; they stand for "padding", "start of sequence", "unknown" and "unused" respectively.

The example then encodes the sequences with a Tokenizer. We add a few print statements to see how the data looks before and after encoding:

from keras.preprocessing.text import Tokenizer

tmp_list = sorted(x_train[0])  # sort a copy; calling x_train[0].sort() would mutate the dataset in place
print(tmp_list)
print('Vectorizing sequence data...')
tokenizer = Tokenizer(num_words=max_words)
x_train = tokenizer.sequences_to_matrix(x_train, mode='binary')
x_test = tokenizer.sequences_to_matrix(x_test, mode='binary')
print(x_train[0])

The output is:

[1, 2, 2, 2, 2, 2, 2, 4, 5, 5, 5, 6, 6, 6, 6, 6, 6, 7, 7, 7, 8, 8, 8, 9, 10, 11, 11, 11, 11, 12, 15, 15, 15, 15, 15, 15, 16, 16, 17, 19, 19, 22, 22, 22, 25, 26, 29, 30, 32, 39, 43, 44, 48, 48, 49, 52, 67, 67, 67, 83, 84, 89, 90, 90, 90, 102, 109, 111, 124, 132, 134, 151, 154, 155, 186, 197, 207, 209, 209, 258, 270, 272, 369, 447, 482, 504, 864]
Vectorizing sequence data...
[0. 1. 1. 0. 1. 1. 1. 1. 1. 1. 1. 1. 1. 0. 0. 1. 1. 1. 0. 1. 0. 0. 1. 0.
 0. 1. 1. 0. 0. 1. 1. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 1. 1. 0. 0. 0.
 1. 1. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0.
 ...
 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
(1000 values in total; the remaining rows, almost all zeros, are omitted here)

In other words, if a word occurs anywhere in the text, the corresponding position is set to 1, otherwise 0. Note that this encoding discards word order: "the dog bit me" and "I bit the dog" become indistinguishable, as the sketch below shows. The upside is space: compared with an embedding encoding, the representation shrinks by at least an order of magnitude.
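Here is a minimal sketch of that order loss; the toy indices and num_words=10 are chosen purely for illustration:

from keras.preprocessing.text import Tokenizer

tok = Tokenizer(num_words=10)
a = [[4, 5, 6]]  # one sequence of word indices
b = [[6, 5, 4]]  # the same words in reverse order
print(tok.sequences_to_matrix(a, mode='binary'))  # [[0. 0. 0. 0. 1. 1. 1. 0. 0. 0.]]
print(tok.sequences_to_matrix(b, mode='binary'))  # identical row: order is discarded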

After encoding, the shapes are:

x_train shape: (8982, 1000)
y_train shape: (8982, 46)
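The 46-dimensional labels are the topic ids one-hot encoded, which the stock example does with to_categorical:

import keras

num_classes = 46  # the stock example computes this as np.max(y_train) + 1
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)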

That is, there are 46 classes, and a simple neural network does the classification:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 512)               512512
_________________________________________________________________
activation_1 (Activation)    (None, 512)               0
_________________________________________________________________
dropout_1 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_2 (Dense)              (None, 46)                23598
_________________________________________________________________
activation_2 (Activation)    (None, 46)                0
=================================================================
Total params: 536,110
Trainable params: 536,110
Non-trainable params: 0
_________________________________________________________________
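For reference, here is a sketch of a model consistent with this summary; the layer sizes come from the summary itself, and the ReLU/softmax activations and the 0.5 dropout rate follow the stock reuters_mlp.py:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation

model = Sequential()
model.add(Dense(512, input_shape=(max_words,)))  # (1000 + 1) * 512 = 512512 params
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes))                    # (512 + 1) * 46 = 23598 params
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])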

The script reuters_mlp_relu_vs_selu.py uses the same dataset. Its purpose is to compare two networks and see which performs better: they differ in activation function, in the dropout layer used, and slightly in how the weights are initialized. Here we only print the two network summaries; a sketch of how they are built follows them below:

network1

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 16)                16016
_________________________________________________________________
activation_1 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_1 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_2 (Dense)              (None, 16)                272
_________________________________________________________________
activation_2 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_2 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_3 (Dense)              (None, 16)                272
_________________________________________________________________
activation_3 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_3 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_4 (Dense)              (None, 16)                272
_________________________________________________________________
activation_4 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_4 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_5 (Dense)              (None, 16)                272
_________________________________________________________________
activation_5 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_5 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_6 (Dense)              (None, 16)                272
_________________________________________________________________
activation_6 (Activation)    (None, 16)                0
_________________________________________________________________
dropout_6 (Dropout)          (None, 16)                0
_________________________________________________________________
dense_7 (Dense)              (None, 46)                782
_________________________________________________________________
activation_7 (Activation)    (None, 46)                0
=================================================================
Total params: 18,158
Trainable params: 18,158
Non-trainable params: 0
_________________________________________________________________

network2

________________________________________________________________________________
Layer (type)                        Output Shape                    Param #
================================================================================
dense_1 (Dense)                     (None, 16)                      16016
________________________________________________________________________________
activation_1 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_1 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_2 (Dense)                     (None, 16)                      272
________________________________________________________________________________
activation_2 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_2 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_3 (Dense)                     (None, 16)                      272
________________________________________________________________________________
activation_3 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_3 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_4 (Dense)                     (None, 16)                      272
________________________________________________________________________________
activation_4 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_4 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_5 (Dense)                     (None, 16)                      272
________________________________________________________________________________
activation_5 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_5 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_6 (Dense)                     (None, 16)                      272
________________________________________________________________________________
activation_6 (Activation)           (None, 16)                      0
________________________________________________________________________________
alpha_dropout_6 (AlphaDropout)      (None, 16)                      0
________________________________________________________________________________
dense_7 (Dense)                     (None, 46)                      782
________________________________________________________________________________
activation_7 (Activation)           (None, 46)                      0
================================================================================
Total params: 18,158
Trainable params: 18,158
Non-trainable params: 0
________________________________________________________________________________
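A minimal sketch of how the two networks could be built, consistent with the summaries above. The build_mlp helper and the concrete dropout rates are illustrative assumptions; the stock example wraps the same idea in its own create_network function, using ReLU + Dropout + Glorot initialization for network1 and SELU + AlphaDropout + LeCun-normal initialization for network2:

from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout, AlphaDropout

def build_mlp(activation, dropout_cls, dropout_rate, init):
    # Six Dense(16) hidden blocks plus a 46-way softmax, matching the summaries:
    # (1000+1)*16 = 16016, (16+1)*16 = 272 each, (16+1)*46 = 782 -> 18158 params.
    model = Sequential()
    model.add(Dense(16, input_shape=(max_words,), kernel_initializer=init))
    model.add(Activation(activation))
    model.add(dropout_cls(dropout_rate))
    for _ in range(5):
        model.add(Dense(16, kernel_initializer=init))
        model.add(Activation(activation))
        model.add(dropout_cls(dropout_rate))
    model.add(Dense(46, kernel_initializer=init))
    model.add(Activation('softmax'))
    return model

# The dropout rates below are assumptions for illustration.
network1 = build_mlp('relu', Dropout, 0.5, 'glorot_uniform')
network2 = build_mlp('selu', AlphaDropout, 0.1, 'lecun_normal')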
