
"When will blog posts be generated automatically?"

Preface

If you want to skip the preamble, jump straight to the main text.

Compared with traditional neural networks, RNNs are better at capturing relationships across context, so they are now widely used in natural language processing tasks such as generating text in the style of a training corpus, sequence labeling, and Chinese word segmentation.

This post lists two RNN-based text generation approaches for reference.


Main Text

Character-level text generation

The code below is the official Keras example, with small additions for timing the run and redirecting output to a log file.

'''Example script to generate text from Nietzsche's writings.
At least 20 epochs are required before the generated text
starts sounding coherent.
It is recommended to run this script on GPU, as recurrent
networks are quite computationally intensive.
If you try this script on new data, make sure your corpus
has at least ~100k characters. ~1M is better.
'''
from __future__ import print_function
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout
from keras.layers import LSTM
from keras.optimizers import RMSprop
from keras.utils.data_utils import get_file
import numpy as np
import random
import sys
import time

start_time = time.time()

# redirect all output to a log file
output_file_handler = open('out.log', 'w')
sys.stdout = output_file_handler

path = get_file('nietzsche.txt', origin="https://s3.amazonaws.com/text-datasets/nietzsche.txt")
text = open(path).read().lower()
print('corpus length:', len(text))

chars = sorted(list(set(text)))
print('total chars:', len(chars))
char_indices = dict((c, i) for i, c in enumerate(chars))
indices_char = dict((i, c) for i, c in enumerate(chars))

# cut the text in semi-redundant sequences of maxlen characters
maxlen = 40
step = 3
sentences = []
next_chars = []
for i in range(0, len(text) - maxlen, step):
    sentences.append(text[i: i + maxlen])
    next_chars.append(text[i + maxlen])
print('nb sequences:', len(sentences))

# one-hot encode the input windows and the target characters
print('Vectorization...')
X = np.zeros((len(sentences), maxlen, len(chars)), dtype=np.bool)
y = np.zeros((len(sentences), len(chars)), dtype=np.bool)
for i, sentence in enumerate(sentences):
    for t, char in enumerate(sentence):
        X[i, t, char_indices[char]] = 1
    y[i, char_indices[next_chars[i]]] = 1

# build the model: a single LSTM
print('Build model...')
model = Sequential()
model.add(LSTM(128, input_shape=(maxlen, len(chars))))
model.add(Dense(len(chars)))
model.add(Activation('softmax'))

optimizer = RMSprop(lr=0.01)
model.compile(loss='categorical_crossentropy', optimizer=optimizer)


def sample(preds, temperature=1.0):
    # helper function to sample an index from a probability array
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)


# train the model, output generated text after each iteration
for iteration in range(1, 60):
    end_time = time.time()
    print('training used time : ' + str(end_time - start_time))
    print()
    print('-' * 50)
    print('Iteration', iteration)
    model.fit(X, y, batch_size=128, nb_epoch=1)

    start_index = random.randint(0, len(text) - maxlen - 1)

    for diversity in [0.2, 0.5, 1.0, 1.2]:
        print()
        print('----- diversity:', diversity)

        generated = ''
        sentence = text[start_index: start_index + maxlen]
        generated += sentence
        print('----- Generating with seed: "' + sentence + '"')
        sys.stdout.write(generated)

        for i in range(400):
            x = np.zeros((1, maxlen, len(chars)))
            for t, char in enumerate(sentence):
                x[0, t, char_indices[char]] = 1.

            preds = model.predict(x, verbose=0)[0]
            next_index = sample(preds, diversity)
            next_char = indices_char[next_index]

            generated += next_char
            sentence = sentence[1:] + next_char

            sys.stdout.write(next_char)
            sys.stdout.flush()
        print()
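The interesting part of the generation loop is the sample helper: it rescales the model's predicted distribution by a temperature before drawing a character, so low temperatures give conservative, repetitive text while high temperatures give more surprising (and more error-prone) text. The standalone sketch below shows that effect on a made-up four-character distribution (the probability values are purely illustrative, no trained model is needed):

import numpy as np

def sample(preds, temperature=1.0):
    # rescale the log-probabilities by the temperature, renormalize, then draw one index
    preds = np.asarray(preds).astype('float64')
    preds = np.log(preds) / temperature
    exp_preds = np.exp(preds)
    preds = exp_preds / np.sum(exp_preds)
    probas = np.random.multinomial(1, preds, 1)
    return np.argmax(probas)

# toy distribution over four "characters"
probs = [0.5, 0.3, 0.15, 0.05]
for t in [0.2, 0.5, 1.0, 1.2]:
    draws = [sample(probs, t) for _ in range(1000)]
    print('temperature', t, '-> draw counts', np.bincount(draws, minlength=4))

At temperature 0.2 almost every draw lands on the most likely character; at 1.2 the counts are noticeably flatter, which is exactly the diversity knob used in the training loop above.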

Text generation combined with word2vec

This code is not finished yet; I will complete it when I find the time. For now it only sketches the idea. More code is available on github.

'''Example script to generate text using keras and word2vec.
At least 20 epochs are required before the generated text
starts sounding coherent.
It is recommended to run this script on GPU, as recurrent
networks are quite computationally intensive.
'''
from __future__ import print_function
from keras.models import Sequential
from keras.layers import Dense, Activation, Dropout
from keras.layers import LSTM
from keras.optimizers import RMSprop
from keras.utils.data_utils import get_file
from nltk import tokenize
import numpy as np
import random
import sys
import os
import nltk
import gensim, logging

logging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s', level=logging.INFO)


# a memory-friendly iterator over the sentences of every file in a directory
class MySentences(object):
    def __init__(self, dirname, min_word_count_in_sentence=1):
        self.dirname = dirname
        self.min_word_count_in_sentence = min_word_count_in_sentence

    def process_line(self, line):
        words = line.split()
        return words

    def __iter__(self):
        for fname in os.listdir(self.dirname):
            for line in open(os.path.join(self.dirname, fname)):
                processed_line = self.process_line(line)
                if len(processed_line) >= self.min_word_count_in_sentence:
                    yield processed_line
                else:
                    continue


def generate_word2vec_train_files(input_dir, output_dir, sentence_start_token, sentence_end_token,
                                  unkown_token, word_min_count, word2vec_size):
    print('generate_word2vec_train_files...')
    tmp_word2vec_model = gensim.models.Word2Vec(min_count=word_min_count, size=word2vec_size)
    original_sentences = MySentences(input_dir)
    tmp_word2vec_model.build_vocab(original_sentences)
    original_word2vec_vocab = tmp_word2vec_model.vocab
    make_dir_if_not_exist(output_dir)
    for fname in os.listdir(input_dir):
        output_file = open(os.path.join(output_dir, fname), 'w')
        line_count = 0
        for line in open(os.path.join(input_dir, fname)):
            line = line.strip(' -=:\"\'_*\n')
            if len(line) == 0:
                continue
            sentences = tokenize.sent_tokenize(line)
            for idx, sentence in enumerate(sentences):
                words = sentence.split()
                for word_idx, word in enumerate(words):
                    if word not in original_word2vec_vocab:
                        words[word_idx] = unkown_token  # TODO
                sentence = " ".join(word for word in words)
                sentences[idx] = sentence_start_token + ' ' + sentence + ' ' + sentence_end_token + '\n'
            line_count += len(sentences)
            output_file.writelines(sentences)
        output_file.close()
        print("line_count", line_count)


def train_word2vec_model(dataset_dir, save_model_file, word_min_count, word2vec_size):
    print('train_word2vec_model...')
    word2vec_model = gensim.models.Word2Vec(min_count=word_min_count, size=word2vec_size)
    train_sentences = MySentences(dataset_dir)
    word2vec_model.build_vocab(train_sentences)
    sentences = MySentences(dataset_dir)
    word2vec_model.train(sentences)
    word2vec_model.save(save_model_file)
    return word2vec_model


def load_existing_word2vec_model(model_file_path):
    model = None
    if os.path.exists(model_file_path):
        print("load existing model...")
        model = gensim.models.Word2Vec.load(model_file_path)
    return model


def generate_rnn_train_files(input_dir, output_dir, fixed_sentence_len, unkown_token,
                             sentence_start_token, sentence_end_token):
    print('generate_rnn_train_files...')
    make_dir_if_not_exist(output_dir)
    long_than_fixed_len_count = 0
    total_sentence_count = 0
    for fname in os.listdir(input_dir):
        output_file = open(os.path.join(output_dir, fname), 'w')
        for sentence in open(os.path.join(input_dir, fname)):
            sentence = sentence.strip('\n')
            total_sentence_count += 1
            words = sentence.split()
            len_of_sentence = len(words)
            if len_of_sentence > fixed_sentence_len:
                long_than_fixed_len_count += 1
                continue
            elif len_of_sentence < fixed_sentence_len:
                # pad short sentences with the end-of-sentence token
                for i in range(0, fixed_sentence_len - len_of_sentence):
                    sentence = sentence + ' ' + sentence_end_token
            output_file.write(sentence + '\n')
        output_file.close()
    print("sentence longer than fixed_len : %d / %d" % (long_than_fixed_len_count, total_sentence_count))


def train_rnn_model(dataset_dir, fixed_sentence_len, word2vec_size, word2vec_model):
    # build the model: a single LSTM
    print('Build RNN model...')
    rnn_model = Sequential()
    rnn_model.add(LSTM(128, input_shape=(fixed_sentence_len, word2vec_size)))
    rnn_model.add(Dense(word2vec_size))
    rnn_model.add(Activation('softmax'))
    optimizer = RMSprop(lr=0.01)
    rnn_model.compile(loss='categorical_crossentropy', optimizer=optimizer)

    print('Generating RNN train data...')
    X = []  # np.zeros((0, fixed_sentence_len, word2vec_size), dtype=np.float32)
    y = []  # np.zeros((0, word2vec_size), dtype=np.float32)
    sentences = MySentences(dataset_dir)
    for sentence in sentences:
        tmp_x = np.asarray([word2vec_model[w] for w in sentence[:-1]])
        tmp_y = np.asarray([word2vec_model[w] for w in sentence[1:]])
        tmp_x = np.zeros((fixed_sentence_len, word2vec_size), dtype=np.float32)
        for idx, word in enumerate(sentence):
            tmp_x[idx] = word2vec_model[word]
        X.append()  # NOTE: the draft stops here; the training pairs are never assembled
    # X, y = generate_rnn_train_data()
    print(X)
    print(y)
    print('Generate RNN train data end!')
    # rnn_model.fit()
    print('Build RNN model over!')
    return rnn_model


class Config:
    WORD2VEC_MODE_FILE = "./word2vec_model.model"
    ORIGINAL_TRAIN_DATASET_DIR = "./small_train_text"
    WORD2VEC_TRAIN_DATASET_DIR = "./small_word2vec_train_text"
    RNN_TRAIN_DATASET_DIR = "./small_rnn_train_text"
    SENTENCE_START_TOKEN = "SENTENCE_START_TOKEN"
    SENTENCE_END_TOKEN = "SENTENCE_END_TOKEN"
    UNKNOWN_TOKEN = "UNKNOWN_TOKEN"
    FIXED_SENTENCE_LEN = 30
    MIN_COUNT = 2
    WORD2VEC_SIZE = 20


def make_dir_if_not_exist(dirpath):
    if not os.path.exists(dirpath):
        os.mkdir(dirpath)


def main():
    # word2vec train
    word2vec_model = load_existing_word2vec_model(Config.WORD2VEC_MODE_FILE)
    if word2vec_model == None:
        generate_word2vec_train_files(Config.ORIGINAL_TRAIN_DATASET_DIR, Config.WORD2VEC_TRAIN_DATASET_DIR,
                                      Config.SENTENCE_START_TOKEN, Config.SENTENCE_END_TOKEN,
                                      Config.UNKNOWN_TOKEN, Config.MIN_COUNT, Config.WORD2VEC_SIZE)
        word2vec_model = train_word2vec_model(Config.WORD2VEC_TRAIN_DATASET_DIR, Config.WORD2VEC_MODE_FILE,
                                              Config.MIN_COUNT, Config.WORD2VEC_SIZE)

    # rnn train
    generate_rnn_train_files(Config.WORD2VEC_TRAIN_DATASET_DIR, Config.RNN_TRAIN_DATASET_DIR,
                             Config.FIXED_SENTENCE_LEN, Config.UNKNOWN_TOKEN,
                             Config.SENTENCE_START_TOKEN, Config.SENTENCE_END_TOKEN)
    rnn_model = train_rnn_model(Config.RNN_TRAIN_DATASET_DIR, Config.FIXED_SENTENCE_LEN,
                                Config.WORD2VEC_SIZE, word2vec_model)


main()
# if __name__ == "__main__":
#     main()
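The draft stops exactly where the training pairs for the LSTM should be assembled. Below is a minimal sketch of one way to finish that step; it assumes every sentence has already been padded to fixed_sentence_len and that every token is in the word2vec vocabulary. The helper name build_rnn_train_data and the choice of target (the vector of the sentence's last word) are my own assumptions, not the author's final design:

import numpy as np

def build_rnn_train_data(sentences, fixed_sentence_len, word2vec_size, word2vec_model):
    # One (input, target) pair per sentence: the input is the padded sequence of word
    # vectors for all but the last token, the target is the last token's word vector.
    X, y = [], []
    for sentence in sentences:                      # each sentence is a list of tokens
        x = np.zeros((fixed_sentence_len, word2vec_size), dtype=np.float32)
        for idx, word in enumerate(sentence[:-1]):
            x[idx] = word2vec_model[word]           # gensim < 4.0 style vector lookup
        X.append(x)
        y.append(np.asarray(word2vec_model[sentence[-1]], dtype=np.float32))
    return np.asarray(X), np.asarray(y)

# hypothetical usage with the objects defined in the draft above:
# X, y = build_rnn_train_data(MySentences(Config.RNN_TRAIN_DATASET_DIR),
#                             Config.FIXED_SENTENCE_LEN, Config.WORD2VEC_SIZE, word2vec_model)
# rnn_model.fit(X, y, batch_size=128, nb_epoch=10)

Note that if the network is trained to regress onto word vectors like this, a mean-squared-error or cosine loss is a more natural fit than the softmax plus categorical cross-entropy used in the draft; the alternative is to keep the softmax and predict a word index over the vocabulary instead of a vector.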

Postscript

For now, RNN-based text generation can produce reasonably fluent sentences, but it is still far from being able to write real articles. At heart, an RNN generates text according to how often words and phrases appear in the training set, and this kind of brute-force imitation is ultimately not a fundamental solution. Combining it with other techniques from the field of artificial intelligence may eventually give better results. (End)