DL (LSTM): Word-Level Prediction on the "Alice's Adventures in Wonderland" Novel Dataset Using the LSTM Algorithm (Keras-Based)

Contents

Word-Level Prediction on the "Alice's Adventures in Wonderland" Novel Dataset Using the LSTM Algorithm (Keras-Based)

Design Approach

Output

Core Code


Word-Level Prediction on the "Alice's Adventures in Wonderland" Novel Dataset Using the LSTM Algorithm (Keras-Based)

Design Approach

To be updated……

Output

rawtext_BySpaceConnect: ALICE'S ADVENTURES IN WONDERLAND  Lewis Carroll  THE MILLENNIUM FULCRUM EDITION 3.0  CHAPTER I. Down the Rabbit-Hole  Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?'  So she was considering in her own mind (as well as she could, for the hot day
rawtext2WordLists: ["ALICE'S", 'ADVENTURES', 'IN', 'WONDERLAND', 'Lewis', 'Carroll', 'THE', 'MILLENNIUM', 'FULCRUM', 'EDITION', '3.0', 'CHAPTER', 'I', 'Down', 'the', 'Rabbit-Hole', 'Alice', 'was', 'beginning', 'to', 'get', 'very', 'tired', 'of', 'sitting', 'by', 'her', 'sister', 'on', 'the', 'bank', 'and', 'of', 'having', 'nothing', 'to', 'do', 'once', 'or', 'twice', 'she', 'had', 'peeped', 'into', 'the', 'book', 'her', 'sister', 'was', 'reading', 'but', 'it', 'had', 'no', 'pictures', 'or', 'conversations', 'in', 'it', 'and', 'what', 'is', 'the', 'use', 'of', 'a', 'book', 'thought', 'Alice', 'without', 'pictures', 'or', 'conversations', 'So', 'she', 'was', 'considering', 'in', 'her', 'own', 'mind', 'as', 'well', 'as', 'she', 'could', 'for', 'the', 'hot', 'day', 'made', 'her', 'feel', 'very', 'sleepy', 'and', 'stupid', 'whether', 'the', 'pleasure', 'of', 'making', 'a', 'daisy-chain', 'would', 'be', 'worth', 'the', 'trouble', 'of', 'getting', 'up', 'and', 'picking', 'the', 'daisies', 'when', 'suddenly', 'a', 'White', 'Rabbit', 'with', 'pink', 'eyes', 'ran', 'close', 'by', 'her', 'There', 'was', 'nothing', 'so', 'VERY', 'remarkable', 'in', 'that', 'nor', 'did', 'Alice', 'think', 'it', 'so', 'VERY', 'much', 'out', 'of', 'the', 'way', 'to', 'hear', 'the', 'Rabbit', 'say', 'to', 'itself', 'Oh', 'dear', 'Oh', 'dear', 'I', 'shall', 'be', 'late', 'when', 'she', 'thought', 'it', 'over', 'afterwards', 'it', 'occurred', 'to', 'her', 'that', 'she', 'ought', 'to', 'have', 'wondered', 'at', 'this', 'but', 'at', 'the', 'time', 'it', 'all', 'seemed', 'quite', 'natural', 'but', 'when', 'the', 'Rabbit', 'actually', 'TOOK', 'A', 'WATCH', 'OUT', 'OF', 'ITS', 'WAISTCOAT-POCKET', 'and', 'looked', 'at', 'it', 'and', 'then', 'hurried', 'on', 'Alice', 'started', 'to', 'her', 'feet', 'for', 'it', 'flashed', 'across', 'her', 'mind', 'that', 'she', 'had', 'never', 'before', 'seen', 'a', 'rabbit', 'with', 'either', 'a', 'waistcoat-pocket', 'or', 'a', 'watch', 'to', 'take', 'out', 'of', 'it', 'and', 'burning', 'with', 'curiosity', 'she', 'ran', 'across', 'the', 'field', 'after', 'it', 'and', 'fortunately', 'was', 'just', 'in', 'time', 'to', 'see', 'it', 'pop', 'down', 'a', 'large', 'rabbit-hole', 'under', 'the', 'hedge', 'In', 'another', 'moment', 'down', 'went', 'Alice', 'after', 'it', 'never', 'once', 'considering', 'how', 'in', 'the', 'world', 'she', 'was', 'to', 'get', 'out', 'again', 'The', 'rabbit-hole', 'went', 'straight', 'on', 'like', 'a', 'tunnel', 'for', 'some', 'way', 'and', 'then', 'dipped', 'suddenly', 'down', 'so', 'suddenly', 'that', 'Alice', 'had', 'not', 'a', 'moment', 'to', 'think', 'about', 'stopping', 'herself', 'before', 'she', 'found', 'herself', 'falling', 'down', 'a', 'very', 'deep', 'well', 'Either', 'the', 'well', 'was', 'very', 'deep', 'or', 'she', 'fell', 'very', 'slowly', 'for', 'she', 'had', 'plenty', 'of', 'time', 'as', 'she', 'went', 'down', 'to', 'look', 'about', 'her', 'and', 'to', 'wonder', 'what', 'was', 'going', 'to', 'happen', 'next', 'First', 'she', 'tried', 'to', 'look', 'down', 'and', 'make', 'out', 'what', 'she', 'was', 'coming', 'to', 'but', 'it', 'was', 'too', 'dark', 'to', 'see', 'anything', 'then', 'she', 'looked', 'at', 'the', 'sides', 'of', 'the', 'well', 'and', 'noticed', 'that', 'they', 'were', 'filled', 'with', 'cupboards', 'and', 'book-shelves', 'here', 'and', 'there', 'she', 'saw', 'maps', 'and', 'pictures', 'hung', 'upon', 'pegs', 'She', 'took', 'down', 'a', 'jar', 'from', 'one', 'of', 'the', 'shelves', 'as', 'she', 'passed', 'it', 'was', 'labelled', 'ORANGE', 
'MARMALADE', 'but', 'to', 'her', 'great', 'disappointment', 'it', 'was', 'empty', 'she', 'did', 'not', 'like', 'to', 'drop', 'the', 'jar', 'for', 'fear', 'of', 'killing', 'somebody', 'so', 'managed', 'to', 'put', 'it', 'into', 'one', 'of', 'the', 'cupboards', 'as', 'she', 'fell', 'past', 'it', 'Well', 'thought', 'Alice', 'to', 'herself', 'after', 'such', 'a', 'fall', 'as', 'this', 'I', 'shall', 'think', 'nothing', 'of', 'tumbling', 'down', 'stairs', 'How', 'brave', "they'll", 'all', 'think', 'me', 'at', 'home', 'Why', 'I', "wouldn't", 'say']
rawtext_BySpace: ALICE'S ADVENTURES IN WONDERLAND Lewis Carroll THE MILLENNIUM FULCRUM EDITION 3.0 CHAPTER I Down the Rabbit Hole Alice was beginning to get very tired of sitting by her sister on the bank and of having nothing to do once or twice she had peeped into the book her sister was reading but it had no pictures or conversations in it and what is the use of a book thought Alice without pictures or conversations So she was considering in her own mind as well as she could for the hot day made her feel very
words_num: 26694
vocab_num: 3063
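
The two counts above come from splitting the cleaned text on whitespace and building a sorted vocabulary, mirroring the three logged stages (rawtext_BySpaceConnect, rawtext2WordLists, rawtext_BySpace). The original preprocessing code is not shown, so the following is only a minimal sketch under assumed names (wonderland.txt, word_to_int); the punctuation rule is an approximation and may differ from the author's on edge cases:

import re

# Load the raw novel and join lines with spaces (filename is an assumption)
raw_text = open('wonderland.txt', encoding='utf-8').read().replace('\n', ' ')

# Keep letters, digits, and in-word apostrophes/hyphens/dots; drop other punctuation,
# then strip stray quotes/periods from token edges ("I." -> "I", while "3.0" survives)
tokens = [w.strip(".'\"") for w in re.sub(r"[^\w\s'.-]", ' ', raw_text).split()]
words = [w for w in tokens if w]

# Sorted vocabulary and a word -> integer lookup table
vocab = sorted(set(words))
word_to_int = {w: i for i, w in enumerate(vocab)}

print('words_num:', len(words))   # 26694 in the run above
print('vocab_num:', len(vocab))   # 3063 in the run above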
dataX: 26594 100 [[19, 18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713], [18, 238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144], [238, 547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006], [547, 278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851], [278, 84, 469, 294, 160, 133, 16, 74, 227, 125, 2713, 393, 223, 31, 2932, 769, 2773, 1456, 2905, 2770, 2006, 2500, 862, 1569, 2495, 2019, 2713, 733, 660, 2006, 1543, 1988, 2773, 1144, 2020, 2035, 2841, 2434, 1513, 2091, 1663, 2713, 810, 1569, 2495, 2932, 2258, 856, 1675, 1513, 1977, 2111, 2035, 1006, 1640, 1675, 660, 2960, 1673, 2713, 2886, 2006, 594, 810, 2741, 31, 3004, 2111, 2035, 1006, 440, 2434, 2932, 996, 1640, 1569, 2051, 1897, 701, 2954, 701, 2434, 1012, 1402, 2713, 1603, 1083, 1847, 1569, 1328, 2905, 2513, 660, 2637, 2969, 2713, 2144, 2006, 1851, 594]]
dataY: 26594 [2144, 2006, 1851, 594, 1074]
Total patterns: 26594
X_train.shape (26594, 100, 1)
Y_train.shape (26594, 3063)
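
dataX/dataY above are built by sliding a 100-word window over the integer-encoded text: each sample is 100 consecutive word indices and its label is the next word, giving 26694 - 100 = 26594 patterns. The reshape to (26594, 100, 1) and the one-hot (26594, 3063) targets follow the standard Keras recipe for this kind of model; a sketch, assuming the words/word_to_int/vocab names from the previous snippet and the usual scaling of indices into [0, 1]:

import numpy as np
from keras.utils import np_utils

seq_length = 100
dataX, dataY = [], []
for i in range(len(words) - seq_length):              # 26594 patterns
    dataX.append([word_to_int[w] for w in words[i:i + seq_length]])
    dataY.append(word_to_int[words[i + seq_length]])

n_patterns = len(dataX)
X_train = np.reshape(dataX, (n_patterns, seq_length, 1))  # (26594, 100, 1)
X_train = X_train / float(len(vocab))                     # scale indices to [0, 1]
Y_train = np_utils.to_categorical(dataY)                  # (26594, 3063) one-hot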
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_1 (LSTM)                (None, 256)               264192
_________________________________________________________________
dropout_1 (Dropout)          (None, 256)               0
_________________________________________________________________
dense_1 (Dense)              (None, 3063)              787191
=================================================================
Total params: 1,051,383
Trainable params: 1,051,383
Non-trainable params: 0
_________________________________________________________________
LSTM_Model None
……
Epoch 00005: loss improved from 6.26403 to 6.26198, saving model to hdf5/word-weights-improvement-05-6.2620.hdf5
Epoch 6/10
128/26594 [..............................] - ETA: 2:09 - loss: 6.8378
256/26594 [..............................] - ETA: 2:06 - loss: 6.4136
384/26594 [..............................] - ETA: 2:01 - loss: 6.3299
512/26594 [..............................] - ETA: 1:57 - loss: 6.4469
640/26594 [..............................] - ETA: 1:57 - loss: 6.4133
……
Epoch 00008: loss improved from 6.25725 to 6.25487, saving model to hdf5/word-weights-improvement-08-6.2549.hdf5
Epoch 9/10
128/26594 [..............................] - ETA: 1:57 - loss: 6.2336
256/26594 [..............................] - ETA: 2:02 - loss: 6.1897
384/26594 [..............................] - ETA: 2:04 - loss: 6.3229
512/26594 [..............................] - ETA: 2:01 - loss: 6.3550
640/26594 [..............................] - ETA: 2:02 - loss: 6.3279
768/26594 [..............................] - ETA: 2:05 - loss: 6.2614
896/26594 [>.............................] - ETA: 2:06 - loss: 6.2433
1024/26594 [>.............................] - ETA: 2:07 - loss: 6.2477
……
25216/26594 [===========================>..] - ETA: 6s - loss: 6.2456
25344/26594 [===========================>..] - ETA: 6s - loss: 6.2469
25472/26594 [===========================>..] - ETA: 5s - loss: 6.2477
25600/26594 [===========================>..] - ETA: 4s - loss: 6.2486
25728/26594 [============================>.] - ETA: 4s - loss: 6.2480
25856/26594 [============================>.] - ETA: 3s - loss: 6.2483
25984/26594 [============================>.] - ETA: 2s - loss: 6.2487
26112/26594 [============================>.] - ETA: 2s - loss: 6.2485
26240/26594 [============================>.] - ETA: 1s - loss: 6.2483
26368/26594 [============================>.] - ETA: 1s - loss: 6.2482
26496/26594 [============================>.] - ETA: 0s - loss: 6.2485
26594/26594 [==============================] - 129s 5ms/step - loss: 6.2499
Epoch 00009: loss improved from 6.25487 to 6.24987, saving model to hdf5/word-weights-improvement-09-6.2499.hdf5
Epoch 10/10
128/26594 [..............................] - ETA: 1:56 - loss: 6.4864
256/26594 [..............................] - ETA: 2:04 - loss: 6.2577
384/26594 [..............................] - ETA: 2:07 - loss: 6.2857
512/26594 [..............................] - ETA: 2:10 - loss: 6.3230
……
25856/26594 [============================>.] - ETA: 3s - loss: 6.2426
25984/26594 [============================>.] - ETA: 3s - loss: 6.2447
26112/26594 [============================>.] - ETA: 2s - loss: 6.2446
26240/26594 [============================>.] - ETA: 1s - loss: 6.2449
26368/26594 [============================>.] - ETA: 1s - loss: 6.2467
26496/26594 [============================>.] - ETA: 0s - loss: 6.2461
26594/26594 [==============================] - 135s 5ms/step - loss: 6.2465
Epoch 00010: loss improved from 6.24987 to 6.24646, saving model to hdf5/word-weights-improvement-10-6.2465.hdf5
LSTM_Pre_word.shape: (3, 3063)
LSTM_Model,Seed:
" cheerfully he seems to grin How neatly spread his claws And welcome little fishes in With gently smiling jaws I'm sure those are not the right words said poor Alice and her eyes filled with tears again as she went on I must be Mabel after all and I shall have to go and live in that poky little house and have next to no toys to play with and oh ever so many lessons to learn No I've made up my mind about it if I'm Mabel I'll stay down here It'll be no use their putting their heads "
199 100
Generated Sequence:
the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the
Done.
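
Every generated word is 'the', the most frequent token in the corpus. That is consistent with the loss plateauing near 6.25 after only 10 epochs (a uniform guess over 3063 words would score ln(3063) ≈ 8.03): the model has learned little more than the unigram distribution, and greedy argmax decoding then repeats the single most probable word forever. Below is a sketch of the kind of greedy generation loop that produces the Seed/Generated Sequence output above; it reuses the assumed names from the earlier snippets, and the step count of 199 is inferred from the "199 100" line in the log:

import numpy as np

int_to_word = {i: w for w, i in word_to_int.items()}

# Seed with a random 100-word training pattern
start = np.random.randint(0, len(dataX) - 1)
pattern = list(dataX[start])
print('Seed:')
print('"', ' '.join(int_to_word[v] for v in pattern), '"')

print('Generated Sequence:')
for _ in range(199):
    x = np.reshape(pattern, (1, len(pattern), 1)) / float(len(vocab))
    prediction = LSTM_Model.predict(x, verbose=0)
    index = int(np.argmax(prediction))     # greedy decoding: most probable word
    print(int_to_word[index], end=' ')
    pattern.append(index)                  # slide the window forward one word
    pattern = pattern[1:]
print('Done.')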

Core Code

from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM

# Single LSTM layer (256 units) -> Dropout -> softmax over the 3063-word vocabulary
LSTM_Model = Sequential()
LSTM_Model.add(LSTM(256, input_shape=(X_train.shape[1], X_train.shape[2])))
LSTM_Model.add(Dropout(0.2))
LSTM_Model.add(Dense(Y_train.shape[1], activation='softmax'))
LSTM_Model.compile(loss='categorical_crossentropy', optimizer='adam')
print('LSTM_Model')
LSTM_Model.summary()  # prints the table itself and returns None (hence "LSTM_Model None" in the log)
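
The training call itself is not shown. The checkpoint filenames in the log (hdf5/word-weights-improvement-10-6.2465.hdf5) and the "loss improved" messages point to a ModelCheckpoint callback monitoring training loss, and the progress bar's increments of 128 over "Epoch 10/10" point to batch_size=128 and epochs=10. A sketch of the matching fit under those assumptions:

from keras.callbacks import ModelCheckpoint

# Save weights whenever the training loss improves; the file pattern matches the log
filepath = 'hdf5/word-weights-improvement-{epoch:02d}-{loss:.4f}.hdf5'
checkpoint = ModelCheckpoint(filepath, monitor='loss', verbose=1,
                             save_best_only=True, mode='min')

LSTM_Model.fit(X_train, Y_train, epochs=10, batch_size=128, callbacks=[checkpoint])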
