Startups

  • Startups in machine learning, deep learning, computer vision, and big data - Startups in AI

Deep Reinforcement Learning

  • David Silver. "Tutorial: Deep Reinforcement Learning." ICML 2016.
  • David Silver’s course. "Reinforcement Learning". 2015.
  • Bahdanau, Dzmitry, Philemon Brakel, Kelvin Xu, Anirudh Goyal, Ryan Lowe, Joelle Pineau, Aaron Courville, and Yoshua Bengio. "An Actor-Critic Algorithm for Sequence Prediction." arXiv preprint arXiv:1607.07086 (2016).
  • Li, Jiwei, Will Monroe, Alan Ritter, and Dan Jurafsky. "Deep Reinforcement Learning for Dialogue Generation." arXiv preprint arXiv:1606.01541 (2016).
  • Pathak, Deepak, Pulkit Agrawal, Alexei A. Efros, and Trevor Darrell. "Curiosity-driven Exploration by Self-supervised Prediction." arXiv preprint arXiv:1705.05363 (2017).
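
Several of the sequence-prediction papers above (the actor-critic and dialogue-generation work in particular) build on the basic policy-gradient idea of treating the decoder as a policy and token choices as actions. As a rough orientation only, here is a minimal REINFORCE-style sketch in PyTorch; the toy reward and model sizes are illustrative assumptions, not any paper's exact setup.

```python
import torch
import torch.nn as nn

# Minimal REINFORCE sketch for sequence generation (illustrative only).
# A tiny GRU decoder samples tokens; a scalar reward on the finished
# sequence is pushed back through the sampled tokens' log-probabilities.

VOCAB, HIDDEN, MAX_LEN = 100, 64, 10

class TinyDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.GRUCell(HIDDEN, HIDDEN)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, bos, steps=MAX_LEN):
        h = torch.zeros(bos.size(0), HIDDEN)
        tok, log_probs, tokens = bos, [], []
        for _ in range(steps):
            h = self.rnn(self.embed(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(h))
            tok = dist.sample()
            log_probs.append(dist.log_prob(tok))
            tokens.append(tok)
        return torch.stack(tokens, 1), torch.stack(log_probs, 1)

def toy_reward(seq):
    # Hypothetical task reward: fraction of even-valued token ids.
    return (seq % 2 == 0).float().mean(dim=1)

decoder = TinyDecoder()
opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)

for _ in range(100):
    bos = torch.zeros(32, dtype=torch.long)          # dummy <bos> ids
    seq, log_p = decoder(bos)
    reward = toy_reward(seq)
    baseline = reward.mean()                         # simple variance-reducing baseline
    loss = -((reward - baseline).unsqueeze(1) * log_p).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```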

Text Generation

  • Rennie, Steven J., Etienne Marcheret, Youssef Mroueh, Jarret Ross, and Vaibhava Goel. "Self-critical sequence training for image captioning." arXiv preprint arXiv:1612.00563 (2016).
  • Lin, Kevin, Dianqi Li, Xiaodong He, Zhengyou Zhang, and Ming-Ting Sun. "Adversarial Ranking for Language Generation." arXiv preprint arXiv:1705.11001 (2017).
  • Zhang, Li, Flood Sung, Feng Liu, Tao Xiang, Shaogang Gong, Yongxin Yang, and Timothy M. Hospedales. "Actor-Critic Sequence Training for Image Captioning." arXiv preprint arXiv:1706.09601 (2017).
  • Wiseman, Sam, Stuart M. Shieber, and Alexander M. Rush. "Challenges in Data-to-Document Generation." arXiv preprint arXiv:1707.08052 (2017).
  • Lebret, Rémi, David Grangier, and Michael Auli. "Neural text generation from structured data with application to the biography domain." arXiv preprint arXiv:1603.07771 (2016).
  • Sha, Lei, Lili Mou, Tianyu Liu, Pascal Poupart, Sujian Li, Baobao Chang, and Zhifang Sui. "Order-Planning Neural Text Generation From Structured Data." arXiv preprint arXiv:1709.00155 (2017).
  • Jiaxian Guo, Sidi Lu, Han Cai, Weinan Zhang, Yong Yu, Jun Wang. "Long Text Generation via Adversarial Training with Leaked Information." arXiv preprint arXiv:1709.08624 (2017).
  • Guu, Kelvin, Tatsunori B. Hashimoto, Yonatan Oren, and Percy Liang. "Generating Sentences by Editing Prototypes." arXiv preprint arXiv:1709.08878 (2017).
  • Tianyu Liu, Kexiang Wang, Lei Sha, Baobao Chang, Zhifang Sui. "Table-to-text Generation by Structure-aware Seq2seq Learning." arXiv preprint arXiv:1711.09724 (2017).
  • Kahou, Samira Ebrahimi, Adam Atkinson, Vincent Michalski, Akos Kadar, Adam Trischler, and Yoshua Bengio. "FigureQA: An Annotated Figure Dataset for Visual Reasoning." arXiv preprint arXiv:1710.07300 (2017).
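
Rennie et al.'s self-critical training (above) uses the model's own greedy decode as the reward baseline. Below is a minimal sketch of that loss shape, assuming a decoder that already returns a sampled sequence with log-probabilities plus a greedy sequence, and some task reward such as CIDEr; the tensor names are placeholders, not any released implementation.

```python
import torch

def self_critical_loss(sample_log_probs, sample_reward, greedy_reward):
    """Self-critical policy-gradient loss (sketch in the spirit of Rennie et al., 2016).

    sample_log_probs: (batch, time) log-probs of the sampled tokens
    sample_reward:    (batch,) task reward (e.g. CIDEr) of sampled sequences
    greedy_reward:    (batch,) same reward for greedy-decoded sequences
    """
    advantage = (sample_reward - greedy_reward).unsqueeze(1)  # baseline = greedy score
    return -(advantage.detach() * sample_log_probs).mean()

# Hypothetical usage with stand-in numbers:
log_p = torch.randn(4, 12, requires_grad=True)  # stand-in for sampled-token log-probs
loss = self_critical_loss(log_p,
                          torch.tensor([0.6, 0.3, 0.8, 0.5]),
                          torch.tensor([0.5, 0.4, 0.7, 0.5]))
loss.backward()
```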

Text Summarization

  • Ryang, Seonggi, and Takeshi Abekawa. "Framework of automatic text summarization using reinforcement learning." In Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, pp. 256-265. Association for Computational Linguistics, 2012. [not neural-based methods]
  • King, Ben, Rahul Jha, Tyler Johnson, Vaishnavi Sundararajan, and Clayton Scott. "Experiments in Automatic Text Summarization Using Deep Neural Networks." Machine Learning (2011).
  • Liu, Yan, Sheng-hua Zhong, and Wenjie Li. "Query-Oriented Multi-Document Summarization via Unsupervised Deep Learning." AAAI. 2012.
  • Rioux, Cody, Sadid A. Hasan, and Yllias Chali. "Fear the REAPER: A System for Automatic Multi-Document Summarization with Reinforcement Learning." In EMNLP, pp. 681-690. 2014. [not neural-based methods]
  • PadmaPriya, G., and K. Duraiswamy. "An Approach For Text Summarization Using Deep Learning Algorithm." Journal of Computer Science 10, no. 1 (2013): 1-9.
  • Denil, Misha, Alban Demiraj, and Nando de Freitas. "Extraction of Salient Sentences from Labelled Documents." arXiv preprint arXiv:1412.6815 (2014).
  • Kågebäck, Mikael, et al. "Extractive summarization using continuous vector space models." Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC)@ EACL. 2014.
  • Denil, Misha, Alban Demiraj, Nal Kalchbrenner, Phil Blunsom, and Nando de Freitas. "Modelling, Visualising and Summarising Documents with a Single Convolutional Neural Network." arXiv preprint arXiv:1406.3830 (2014).
  • Cao, Ziqiang, Furu Wei, Li Dong, Sujian Li, and Ming Zhou. "Ranking with Recursive Neural Networks and Its Application to Multi-document Summarization." (AAAI'2015).
  • Fei Liu, Jeffrey Flanigan, Sam Thomson, Norman Sadeh, and Noah A. Smith. "Toward Abstractive Summarization Using Semantic Representations." NAACL 2015
  • Wenpeng Yin, Yulong Pei. "Optimizing Sentence Modeling and Selection for Document Summarization." IJCAI 2015
  • He, Zhanying, Chun Chen, Jiajun Bu, Can Wang, Lijun Zhang, Deng Cai, and Xiaofei He. "Document Summarization Based on Data Reconstruction." In AAAI. 2012.
  • Liu, He, Hongliang Yu, and Zhi-Hong Deng. "Multi-Document Summarization Based on Two-Level Sparse Representation Model." In Twenty-Ninth AAAI Conference on Artificial Intelligence. 2015.
  • Jin-ge Yao, Xiaojun Wan, Jianguo Xiao. "Compressive Document Summarization via Sparse Optimization." IJCAI 2015
  • Piji Li, Lidong Bing, Wai Lam, Hang Li, and Yi Liao. "Reader-Aware Multi-Document Summarization via Sparse Coding." IJCAI 2015.
  • Lopyrev, Konstantin. "Generating News Headlines with Recurrent Neural Networks." arXiv preprint arXiv:1512.01712 (2015). [The first paragraph as document.]
  • Alexander M. Rush, Sumit Chopra, Jason Weston. "A Neural Attention Model for Abstractive Sentence Summarization." EMNLP 2015. [sentence compression]
  • Hu, Baotian, Qingcai Chen, and Fangze Zhu. "LCSTS: a large scale chinese short text summarization dataset." arXiv preprint arXiv:1506.05865 (2015).
  • Gulcehre, Caglar, Sungjin Ahn, Ramesh Nallapati, Bowen Zhou, and Yoshua Bengio. "Pointing the Unknown Words." arXiv preprint arXiv:1603.08148 (2016).
  • Nallapati, Ramesh, Bing Xiang, and Bowen Zhou. "Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond." arXiv preprint arXiv:1602.06023 (2016). [sentence compression]
  • Sumit Chopra, Alexander M. Rush and Michael Auli. "Abstractive Sentence Summarization with Attentive Recurrent Neural Networks" NAACL 2016.
  • Jiatao Gu, Zhengdong Lu, Hang Li, Victor O.K. Li. "Incorporating Copying Mechanism in Sequence-to-Sequence Learning." ACL. (2016)
  • Jianpeng Cheng, Mirella Lapata. "Neural Summarization by Extracting Sentences and Words". ACL. (2016)
  • Zhang, Jianmin, Jin-ge Yao, and Xiaojun Wan. "Toward constructing sports news from live text commentary." In Proceedings of ACL. 2016.
  • Ziqiang Cao, Wenjie Li, Sujian Li, Furu Wei. "AttSum: Joint Learning of Focusing and Summarization with Neural Attention". arXiv:1604.00125 (2016)
  • Ayana, Shiqi Shen, Zhiyuan Liu, Maosong Sun. "Neural Headline Generation with Sentence-wise Optimization". arXiv:1604.01904 (2016)
  • Kikuchi, Yuta, Graham Neubig, Ryohei Sasano, Hiroya Takamura, and Manabu Okumura. "Controlling Output Length in Neural Encoder-Decoders." arXiv preprint arXiv:1609.09552 (2016).
  • Qian Chen, Xiaodan Zhu, Zhenhua Ling, Si Wei and Hui Jiang. "Distraction-Based Neural Networks for Document Summarization." IJCAI 2016.
  • Wang, Lu, and Wang Ling. "Neural Network-Based Abstract Generation for Opinions and Arguments." NAACL 2016.
  • Yishu Miao, Phil Blunsom. "Language as a Latent Variable: Discrete Generative Models for Sentence Compression." EMNLP 2016.
  • Takase, Sho, Jun Suzuki, Naoaki Okazaki, Tsutomu Hirao, and Masaaki Nagata. "Neural headline generation on abstract meaning representation." EMNLP, pp. 1054-1059. 2016.
  • Hongya Song, Zhaochun Ren, Piji Li, Shangsong Liang, Jun Ma, and Maarten de Rijke. Summarizing Answers in Non-Factoid Community Question-Answering. In WSDM 2017: The 10th International Conference on Web Search and Data Mining, 2017.
  • Wenyuan Zeng, Wenjie Luo, Sanja Fidler, Raquel Urtasun. "Efficient Summarization with Read-Again and Copy Mechanism." arXiv preprint arXiv:1611.03382 (2016).
  • Piji Li, Zihao Wang, Wai Lam, Zhaochun Ren, Lidong Bing. "Salience Estimation via Variational Auto-Encoders for Multi-Document Summarization". In AAAI, 2017.
  • Ramesh Nallapati, Feifei Zhai, Bowen Zhou. SummaRuNNer: A Recurrent Neural Network based Sequence Model for Extractive Summarization of Documents. In AAAI, 2017.
  • Ramesh Nallapati, Bowen Zhou, Mingbo Ma. "Classify or Select: Neural Architectures for Extractive Document Summarization." arXiv preprint arXiv:1611.04244 (2016).
  • Suzuki, Jun, and Masaaki Nagata. "Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization." EACL 2017 (2017): 291.
  • Jiwei Tan and Xiaojun Wan. Abstractive Document Summarization with a Graph-Based Attentional Neural Model. ACL, 2017.
  • Preksha Nema, Mitesh M. Khapra, Balaraman Ravindran and Anirban Laha. Diversity driven attention model for query-based abstractive summarization. ACL, 2017.
  • Abigail See, Peter J. Liu and Christopher D. Manning. Get To The Point: Summarization with Pointer-Generator Networks. ACL, 2017.
  • Qingyu Zhou, Nan Yang, Furu Wei and Ming Zhou. Selective Encoding for Abstractive Sentence Summarization. ACL, 2017
  • Maxime Peyrard and Judith Eckle-Kohler. Supervised Learning of Automatic Pyramid for Optimization-Based Multi-Document Summarization. ACL, 2017.
  • Shashi Narayan, Nikos Papasarantopoulos, Mirella Lapata, Shay B. Cohen. "Neural Extractive Summarization with Side Information." arXiv preprint arXiv:1704.04530 (2017).
  • Romain Paulus, Caiming Xiong, Richard Socher. "A Deep Reinforced Model for Abstractive Summarization." (2017).
  • Shibhansh Dohare, Harish Karnick. "Text Summarization using Abstract Meaning Representation." arXiv:1706.01678 (2017).
  • Michihiro Yasunaga, Rui Zhang, Kshitijh Meelu, Ayush Pareek, Krishnan Srinivasan, Dragomir Radev. "Graph-based Neural Multi-Document Summarization." arXiv:1706.06681 (2017).
  • Piji Li, Wai Lam, Lidong Bing, and Zihao Wang. Deep Recurrent Generative Decoder for Abstractive Text Summarization. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP'17). Sep 2017.
  • Piji Li, Wai Lam, Lidong Bing, Weiwei Guo, and Hang Li. Cascaded Attention based Unsupervised Information Distillation for Compressive Summarization. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP'17). Sep 2017.
  • Piji Li, Lidong Bing, Wai Lam. Reader-Aware Multi-Document Summarization: An Enhanced Model and The First Dataset. Proceedings of the EMNLP 2017 Workshop on New Frontiers in Summarization (EMNLP-NewSum'17). Sep 2017.
  • Tan, Jiwei, Xiaojun Wan, and Jianguo Xiao. "From Neural Sentence Summarization to Headline Generation: A Coarse-to-Fine Approach." IJCAI 2017.
  • Ling, Jeffrey, and Alexander M. Rush. "Coarse-to-Fine Attention Models for Document Summarization." EMNLP 2017 (2017): 33.
  • Ziqiang Cao, Furu Wei, Wenjie Li, Sujian Li. "Faithful to the Original: Fact Aware Neural Abstractive Summarization." arXiv:1711.04434 (2017).
  • Angela Fan, David Grangier, Michael Auli. "Controllable Abstractive Summarization." arXiv:1711.05217 (2017).
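
Several abstractive models above (Gu et al., Gulcehre et al., See et al.) mix a generation distribution over the vocabulary with a copy distribution over source tokens. Below is a minimal sketch of that mixing step only, assuming the attention weights and a generation probability p_gen have already been computed; tensor names and sizes are illustrative.

```python
import torch

def pointer_generator_dist(p_vocab, attn, src_ids, p_gen):
    """Combine generate and copy distributions (sketch in the spirit of See et al., 2017).

    p_vocab: (batch, vocab_size)  decoder softmax over the fixed vocabulary
    attn:    (batch, src_len)     attention weights over source tokens
    src_ids: (batch, src_len)     vocabulary ids of the source tokens
    p_gen:   (batch, 1)           probability of generating vs. copying
    """
    copy_dist = torch.zeros_like(p_vocab)
    copy_dist.scatter_add_(1, src_ids, attn)          # route attention mass to source-token ids
    return p_gen * p_vocab + (1.0 - p_gen) * copy_dist

# Toy usage:
batch, src_len, vocab = 2, 5, 20
p_vocab = torch.softmax(torch.randn(batch, vocab), dim=-1)
attn = torch.softmax(torch.randn(batch, src_len), dim=-1)
src_ids = torch.randint(0, vocab, (batch, src_len))
p_gen = torch.sigmoid(torch.randn(batch, 1))
final = pointer_generator_dist(p_vocab, attn, src_ids, p_gen)
assert torch.allclose(final.sum(-1), torch.ones(batch), atol=1e-5)
```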

Opinion Summarization

  • Wu, Haibing, Yiwei Gu, Shangdi Sun, and Xiaodong Gu. "Aspect-based Opinion Summarization with Convolutional Neural Networks." arXiv preprint arXiv:1511.09128 (2015).
  • Irsoy, Ozan, and Claire Cardie. "Opinion Mining with Deep Recurrent Neural Networks." In EMNLP, pp. 720-728. 2014.
  • Piji Li, Zihao Wang, Zhaochun Ren, Lidong Bing, Wai Lam. "Neural Rating Regression with Abstractive Tips Generation for Recommendation." In SIGIR, pp. xx-xx. 2017.

Reading Comprehension

  • Hermann, Karl Moritz, Tomas Kocisky, Edward Grefenstette, Lasse Espeholt, Will Kay, Mustafa Suleyman, and Phil Blunsom. "Teaching machines to read and comprehend." In Advances in Neural Information Processing Systems, pp. 1693-1701. 2015.
  • Hill, Felix, Antoine Bordes, Sumit Chopra, and Jason Weston. "The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations." arXiv preprint arXiv:1511.02301 (2015).
  • Kadlec, Rudolf, Martin Schmid, Ondrej Bajgar, and Jan Kleindienst. "Text Understanding with the Attention Sum Reader Network." arXiv preprint arXiv:1603.01547 (2016).
  • Chen, Danqi, Jason Bolton, and Christopher D. Manning. "A thorough examination of the cnn/daily mail reading comprehension task." arXiv preprint arXiv:1606.02858 (2016).
  • Dhingra, Bhuwan, Hanxiao Liu, William W. Cohen, and Ruslan Salakhutdinov. "Gated-Attention Readers for Text Comprehension." arXiv preprint arXiv:1606.01549 (2016).
  • Sordoni, Alessandro, Phillip Bachman, and Yoshua Bengio. "Iterative Alternating Neural Attention for Machine Reading." arXiv preprint arXiv:1606.02245 (2016).
  • Trischler, Adam, Zheng Ye, Xingdi Yuan, and Kaheer Suleman. "Natural Language Comprehension with the EpiReader." arXiv preprint arXiv:1606.02270 (2016).
  • Yiming Cui, Zhipeng Chen, Si Wei, Shijin Wang, Ting Liu, Guoping Hu. "Attention-over-Attention Neural Networks for Reading Comprehension." arXiv preprint arXiv:1607.04423 (2016).
  • Yiming Cui, Ting Liu, Zhipeng Chen, Shijin Wang, Guoping Hu. "Consensus Attention-based Neural Networks for Chinese Reading Comprehension." arXiv preprint arXiv:1607.02250 (2016).
  • Daniel Hewlett, Alexandre Lacoste, Llion Jones, Illia Polosukhin, Andrew Fandrianto, Jay Han, Matthew Kelcey and David Berthelot. "WIKIREADING: A Novel Large-scale Language Understanding Task over Wikipedia." ACL (2016). pp. 1535-1545.
  • Minghao Hu, Yuxing Peng, Xipeng Qiu. "Mnemonic Reader for Machine Comprehension." arXiv:1705.02798 (2017).
  • Wenhui Wang, Nan Yang, Furu Wei, Baobao Chang and Ming Zhou. "R-NET: Machine Reading Comprehension with Self-matching Networks." ACL (2017).
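
The cloze-style readers above (e.g., the Attention Sum Reader) score each answer candidate by summing attention over every position where that candidate appears in the document. Here is a minimal sketch of that pointer-sum step, assuming encoded document states and a query vector are given; shapes are illustrative.

```python
import torch

def attention_sum(doc_states, query_vec, doc_ids, candidate_ids):
    """Pointer-sum answer scoring (sketch in the spirit of Kadlec et al., 2016).

    doc_states:    (batch, doc_len, dim)  encoded document token states
    query_vec:     (batch, dim)           encoded query
    doc_ids:       (batch, doc_len)       vocabulary ids of document tokens
    candidate_ids: (batch, n_cands)       ids of the answer candidates
    """
    attn = torch.softmax((doc_states * query_vec.unsqueeze(1)).sum(-1), dim=-1)   # (batch, doc_len)
    # Sum attention mass over every document position holding each candidate id.
    match = (doc_ids.unsqueeze(1) == candidate_ids.unsqueeze(2)).float()          # (batch, n_cands, doc_len)
    return (match * attn.unsqueeze(1)).sum(-1)                                    # (batch, n_cands)

# Toy usage:
scores = attention_sum(torch.randn(2, 7, 16), torch.randn(2, 16),
                       torch.randint(0, 50, (2, 7)), torch.randint(0, 50, (2, 3)))
```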

Sentence Modelling

  • Kalchbrenner, Nal, Edward Grefenstette, and Phil Blunsom. "A convolutional neural network for modelling sentences." arXiv preprint arXiv:1404.2188 (2014).
  • Kim, Yoon. "Convolutional neural networks for sentence classification." arXiv preprint arXiv:1408.5882 (2014).
  • Le, Quoc V., and Tomas Mikolov. "Distributed representations of sentences and documents." arXiv preprint arXiv:1405.4053 (2014).
  • Yang, Zichao, Diyi Yang, Chris Dyer, Xiaodong He, Alex Smola, and Eduard Hovy. "Hierarchical Attention Networks for Document Classification." In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2016.
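
Kim's CNN classifier above is the usual starting point for sentence modelling: parallel convolutions over word embeddings, max-over-time pooling, and a linear classifier. A minimal sketch follows, with hyperparameters chosen only for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Convolutional sentence classifier (sketch in the spirit of Kim, 2014)."""

    def __init__(self, vocab=10000, emb=128, n_filters=100,
                 kernel_sizes=(3, 4, 5), n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb, n_filters, k) for k in kernel_sizes)
        self.fc = nn.Linear(n_filters * len(kernel_sizes), n_classes)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)    # (batch, emb, seq_len)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))  # logits: (batch, n_classes)

logits = TextCNN()(torch.randint(0, 10000, (8, 40)))
```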

Reasoning

  • Peng, Baolin, Zhengdong Lu, Hang Li, and Kam-Fai Wong. "Towards Neural Network-based Reasoning." arXiv preprint arXiv:1508.05508 (2015).

Knowledge Engine

  • Bordes, Antoine, Nicolas Usunier, Alberto Garcia-Duran, Jason Weston, and Oksana Yakhnenko. "Translating embeddings for modeling multi-relational data." In Advances in Neural Information Processing Systems, pp. 2787-2795. 2013. [TransE]
  • Lin, Yankai, Shiqi Shen, Zhiyuan Liu, Huanbo Luan, and Maosong Sun. "Neural Relation Extraction with Selective Attention over Instances." ACL (2016)
  • Other Trans-family knowledge-graph embedding models (e.g., TransH, TransR, TransD); see the TransE sketch below.
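
TransE (Bordes et al., above) scores a knowledge-graph triple (h, r, t) by how close h + r lies to t, trained with a margin loss against corrupted triples. A minimal sketch of the score and loss; embedding sizes and the corruption scheme are illustrative, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class TransE(nn.Module):
    """Translation-based KG embedding (sketch in the spirit of Bordes et al., 2013)."""

    def __init__(self, n_entities=1000, n_relations=50, dim=64, margin=1.0):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        self.margin = margin

    def score(self, h, r, t):
        # Lower is better: distance between translated head and tail.
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(p=2, dim=-1)

    def loss(self, pos, neg):
        # pos, neg: (batch, 3) index triples (head, relation, tail);
        # neg is a corrupted copy with a random head or tail.
        d_pos = self.score(pos[:, 0], pos[:, 1], pos[:, 2])
        d_neg = self.score(neg[:, 0], neg[:, 1], neg[:, 2])
        return torch.clamp(self.margin + d_pos - d_neg, min=0).mean()

model = TransE()
pos = torch.randint(0, 1000, (16, 3))
pos[:, 1] = torch.randint(0, 50, (16,))
neg = pos.clone()
neg[:, 2] = torch.randint(0, 1000, (16,))   # corrupt tails
print(model.loss(pos, neg))
```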

Memory Networks

  • Graves, Alex, Greg Wayne, and Ivo Danihelka. "Neural turing machines." arXiv preprint arXiv:1410.5401 (2014).
  • Weston, Jason, Sumit Chopra, and Antoine Bordes. "Memory networks." ICLR (2014).
  • Sukhbaatar, Sainbayar, Jason Weston, and Rob Fergus. "End-to-end memory networks." In Advances in neural information processing systems, pp. 2440-2448. 2015.
  • Weston, Jason, Antoine Bordes, Sumit Chopra, Alexander M. Rush, Bart van Merriënboer, Armand Joulin, and Tomas Mikolov. "Towards ai-complete question answering: A set of prerequisite toy tasks." arXiv preprint arXiv:1502.05698 (2015).
  • Bordes, Antoine, Nicolas Usunier, Sumit Chopra, and Jason Weston. "Large-scale simple question answering with memory networks." arXiv preprint arXiv:1506.02075 (2015).
  • Kumar, Ankit, Ozan Irsoy, Jonathan Su, James Bradbury, Robert English, Brian Pierce, Peter Ondruska, Ishaan Gulrajani, and Richard Socher. "Ask me anything: Dynamic memory networks for natural language processing." arXiv preprint arXiv:1506.07285 (2015).
  • Dodge, Jesse, Andreea Gane, Xiang Zhang, Antoine Bordes, Sumit Chopra, Alexander Miller, Arthur Szlam, and Jason Weston. "Evaluating prerequisite qualities for learning end-to-end dialog systems." arXiv preprint arXiv:1511.06931 (2015).
  • Hill, Felix, Antoine Bordes, Sumit Chopra, and Jason Weston. "The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations." arXiv preprint arXiv:1511.02301 (2015).
  • Weston, Jason. "Dialog-based Language Learning." arXiv preprint arXiv:1604.06045 (2016).
  • Bordes, Antoine, and Jason Weston. "Learning End-to-End Goal-Oriented Dialog." arXiv preprint arXiv:1605.07683 (2016).
  • Chandar, Sarath, Sungjin Ahn, Hugo Larochelle, Pascal Vincent, Gerald Tesauro, and Yoshua Bengio. "Hierarchical Memory Networks." arXiv preprint arXiv:1605.07427 (2016).
  • Jason Weston. "Memory Networks for Language Understanding." ICML Tutorial, 2016.
  • Tang, Yaohua, Fandong Meng, Zhengdong Lu, Hang Li, and Philip LH Yu. "Neural Machine Translation with External Phrase Memory." arXiv preprint arXiv:1606.01792 (2016).
  • Wang, Mingxuan, Zhengdong Lu, Hang Li, and Qun Liu. "Memory-enhanced Decoder for Neural Machine Translation." arXiv preprint arXiv:1606.02003 (2016).
  • Xiong, Caiming, Stephen Merity, and Richard Socher. "Dynamic memory networks for visual and textual question answering." arXiv preprint arXiv:1603.01417 (2016).
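
The end-to-end memory network of Sukhbaatar et al. (above) attends over embedded memory slots with the query, reads a weighted sum, and adds it to the query; multiple hops repeat this step. A single-hop sketch follows, with bag-of-words embeddings and dimensions picked only for illustration.

```python
import torch
import torch.nn as nn

class MemN2NHop(nn.Module):
    """One memory hop (sketch in the spirit of Sukhbaatar et al., 2015)."""

    def __init__(self, vocab=5000, dim=64):
        super().__init__()
        self.embed_in = nn.Embedding(vocab, dim)   # memory keys (A)
        self.embed_out = nn.Embedding(vocab, dim)  # memory values (C)
        self.embed_q = nn.Embedding(vocab, dim)    # query (B)

    def forward(self, memories, query):
        # memories: (batch, n_mem, mem_len) token ids; query: (batch, q_len) token ids
        m = self.embed_in(memories).sum(2)          # bag-of-words memory keys
        c = self.embed_out(memories).sum(2)         # memory values
        u = self.embed_q(query).sum(1)              # query vector (batch, dim)
        p = torch.softmax(torch.bmm(m, u.unsqueeze(2)).squeeze(2), dim=1)  # attention over memories
        o = torch.bmm(p.unsqueeze(1), c).squeeze(1)                        # read vector
        return u + o                                # input to the next hop / answer layer

out = MemN2NHop()(torch.randint(0, 5000, (4, 10, 6)), torch.randint(0, 5000, (4, 6)))
```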

Neural Structures

  • Srivastava, Rupesh Kumar, Klaus Greff, and Jürgen Schmidhuber. "Highway networks." arXiv preprint arXiv:1505.00387 (2015).
  • Srivastava, Rupesh K., Klaus Greff, and Jürgen Schmidhuber. "Training very deep networks." In Advances in Neural Information Processing Systems, pp. 2368-2376. 2015.
  • Vinyals, Oriol, Meire Fortunato, and Navdeep Jaitly. "Pointer networks." In Advances in Neural Information Processing Systems, pp. 2692-2700. 2015.
  • Rasmus, Antti, Mathias Berglund, Mikko Honkala, Harri Valpola, and Tapani Raiko. "Semi-supervised learning with ladder networks." In Advances in Neural Information Processing Systems, pp. 3546-3554. 2015.
  • Bengio, Samy, Oriol Vinyals, Navdeep Jaitly, and Noam Shazeer. "Scheduled sampling for sequence prediction with recurrent neural networks." In Advances in Neural Information Processing Systems, pp. 1171-1179. 2015.
  • He, Kaiming, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. "Deep Residual Learning for Image Recognition." arXiv preprint arXiv:1512.03385 (2015).
  • He, Kaiming. "Tutorial: Deep Residual Networks: Deep Learning Gets Way Deeper." ICML 2016 tutorial.
  • Courbariaux, Matthieu, and Yoshua Bengio. "Binarynet: Training deep neural networks with weights and activations constrained to +1 or -1." arXiv preprint arXiv:1602.02830 (2016).
  • Jiatao Gu, Zhengdong Lu, Hang Li, Victor O.K. Li. "Incorporating Copying Mechanism in Sequence-to-Sequence Learning." ACL (2016)
  • Gulcehre, Caglar, Sungjin Ahn, Ramesh Nallapati, Bowen Zhou, and Yoshua Bengio. "Pointing the Unknown Words." arXiv preprint arXiv:1603.08148 (2016).
  • Andreas, Jacob, Marcus Rohrbach, Trevor Darrell, and Dan Klein. "Learning to compose neural networks for question answering." NAACL 2016.
  • Julian Georg Zilly, Rupesh Kumar Srivastava, Jan Koutník, Jürgen Schmidhuber. "Recurrent Highway Networks." arXiv preprint arXiv:1607.03474 (2016).
  • Zhilin Yang, Ye Yuan, Yuexin Wu, Ruslan Salakhutdinov, William W. Cohen. "Review Networks for Caption Generation." arXiv preprint arXiv:1605.07912 (2016).
  • Xiang Li, Tao Qin, Jian Yang, Tie-Yan Liu. "LightRNN: Memory and Computation-Efficient Recurrent Neural Networks." arXiv preprint arXiv:1610.09893 (2016).
  • Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, Hang Li. "Neural Machine Translation with Reconstruction." arXiv preprint arXiv:1611.01874 (2016).
  • Yingce Xia, Di He, Tao Qin, Liwei Wang, Nenghai Yu, Tie-Yan Liu, Wei-Ying Ma. "Dual Learning for Machine Translation." arXiv preprint arXiv:1611.00179 (2016).
  • Bahdanau, Dzmitry, Philemon Brakel, Kelvin Xu, Anirudh Goyal, Ryan Lowe, Joelle Pineau, Aaron Courville, and Yoshua Bengio. "An actor-critic algorithm for sequence prediction." arXiv preprint arXiv:1607.07086 (2016).
  • Kannan, Anjuli, and Oriol Vinyals. "Adversarial evaluation of dialogue models." arXiv preprint arXiv:1701.08198 (2017).
  • Kawthekar, Prasad, Raunaq Rewari, and Suvrat Bhooshan. "Evaluating Generative Models for Text Generation."
  • Li, Jiwei, Will Monroe, Tianlin Shi, Alan Ritter, and Dan Jurafsky. "Adversarial Learning for Neural Dialogue Generation." arXiv preprint arXiv:1701.06547 (2017).
  • Yang, Zhen, Wei Chen, Feng Wang, and Bo Xu. "Improving Neural Machine Translation with Conditional Sequence Generative Adversarial Nets." arXiv preprint arXiv:1703.04887 (2017).
  • Lijun Wu, Yingce Xia, Li Zhao, Fei Tian, Tao Qin, Jianhuang Lai, Tie-Yan Liu. "Adversarial Neural Machine Translation." IJCAI (2017).
  • Liu, Pengfei, Xipeng Qiu, and Xuanjing Huang. "Adversarial Multi-task Learning for Text Classification." arXiv preprint arXiv:1704.05742 (2017).
  • Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann N. Dauphin. "Convolutional Sequence to Sequence Learning." arXiv preprint arXiv:1705.03122 (2017).
  • Lamb, Alex M., Anirudh Goyal, Ying Zhang, Saizheng Zhang, Aaron C. Courville, and Yoshua Bengio. "Professor forcing: A new algorithm for training recurrent networks." In Advances In Neural Information Processing Systems, pp. 4601-4609. 2016.
  • Rezende, Danilo Jimenez, Shakir Mohamed, and Daan Wierstra. "Stochastic backpropagation and approximate inference in deep generative models." arXiv preprint arXiv:1401.4082 (2014).
  • Kingma, Diederik P., and Max Welling. "Auto-encoding variational bayes." arXiv preprint arXiv:1312.6114 (2013).
  • Fabius, Otto, and Joost R. van Amersfoort. "Variational recurrent auto-encoders." arXiv preprint arXiv:1412.6581 (2014).
  • Bayer, Justin, and Christian Osendorfer. "Learning stochastic recurrent networks." arXiv preprint arXiv:1411.7610 (2014).
  • Bowman, Samuel R., Luke Vilnis, Oriol Vinyals, Andrew M. Dai, Rafal Jozefowicz, and Samy Bengio. "Generating sentences from a continuous space." arXiv preprint arXiv:1511.06349 (2015).
  • Gregor, Karol, Ivo Danihelka, Alex Graves, Danilo Jimenez Rezende, and Daan Wierstra. "DRAW: A recurrent neural network for image generation." arXiv preprint arXiv:1502.04623 (2015).
  • Makhzani, Alireza, Jonathon Shlens, Navdeep Jaitly, and Ian Goodfellow. "Adversarial autoencoders." arXiv preprint arXiv:1511.05644 (2015).
  • Johnson, Matthew J., David Duvenaud, Alexander B. Wiltschko, Sandeep R. Datta, and Ryan P. Adams. "Composing graphical models with neural networks for structured representations and fast inference." arXiv preprint arXiv:1603.06277 (2016).
  • Doersch, Carl. "Tutorial on Variational Autoencoders." arXiv preprint arXiv:1606.05908 (2016).
  • Chung, Junyoung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron C. Courville, and Yoshua Bengio. "A recurrent latent variable model for sequential data." In Advances in neural information processing systems, pp. 2980-2988. 2015.
  • Eslami, S. M., Nicolas Heess, Theophane Weber, Yuval Tassa, Koray Kavukcuoglu, and Geoffrey E. Hinton. "Attend, Infer, Repeat: Fast Scene Understanding with Generative Models." arXiv preprint arXiv:1603.08575 (2016).
  • Shengjia Zhao, Jiaming Song, Stefano Ermon. "InfoVAE: Information Maximizing Variational Autoencoders." arXiv:1706.02262 (2017).
  • Goodfellow, Ian, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. "Generative adversarial nets." In Advances in Neural Information Processing Systems, pp. 2672-2680. 2014
  • Radford, Alec, Luke Metz, and Soumith Chintala. "Unsupervised representation learning with deep convolutional generative adversarial networks." arXiv preprint arXiv:1511.06434 (2015).
  • Denton, Emily L., Soumith Chintala, and Rob Fergus. "Deep Generative Image Models using a Laplacian Pyramid of Adversarial Networks." In Advances in neural information processing systems, pp. 1486-1494. 2015.
  • Dosovitskiy, Alexey, Jost Tobias Springenberg, and Thomas Brox. "Learning to generate chairs with convolutional neural networks." In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1538-1546. 2015.
  • Mathieu, Michael, Camille Couprie, and Yann LeCun. "Deep multi-scale video prediction beyond mean square error." arXiv preprint arXiv:1511.05440 (2015).
  • Salimans, Tim, Ian Goodfellow, Wojciech Zaremba, Vicki Cheung, Alec Radford, and Xi Chen. "Improved Techniques for Training GANs." arXiv preprint arXiv:1606.03498 (2016).
  • Chen, Xi, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, and Pieter Abbeel. "InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets." arXiv preprint arXiv:1606.03657 (2016).
  • Im, Daniel Jiwoong, Chris Dongjoo Kim, Hui Jiang, and Roland Memisevic. "Generating images with recurrent adversarial networks." arXiv preprint arXiv:1602.05110 (2016).
  • Yu, Lantao, Weinan Zhang, Jun Wang, and Yong Yu. "SeqGAN: Sequence Generative Adversarial Nets with Policy Gradient." arXiv preprint arXiv:1609.05473 (2016).
  • Augustus Odena, Christopher Olah, Jonathon Shlens. "Conditional Image Synthesis With Auxiliary Classifier GANs." arXiv preprint arXiv:1610.09585 (2016).
  • Ian Goodfellow. "NIPS 2016 Tutorial: Generative Adversarial Networks." NIPS, 2016.
  • Che, Tong, Yanran Li, Ruixiang Zhang, R. Devon Hjelm, Wenjie Li, Yangqiu Song, and Yoshua Bengio. "Maximum-Likelihood Augmented Discrete Generative Adversarial Networks." arXiv preprint arXiv:1702.07983 (2017).
  • Junbo (Jake) Zhao, Yoon Kim, Kelly Zhang, Alexander M. Rush, Yann LeCun. "Adversarially Regularized Autoencoders for Generating Discrete Structures." arXiv preprint arXiv:1706.04223 (2017).
  • Mike Lewis, Denis Yarats, Yann N. Dauphin, Devi Parikh, Dhruv Batra. "Deal or No Deal? End-to-End Learning for Negotiation Dialogues." (2017).
  • Mihaela Rosca, Balaji Lakshminarayanan, David Warde-Farley, Shakir Mohamed. "Variational Approaches for Auto-Encoding Generative Adversarial Networks." arXiv preprint arXiv:1706.04987 (2017).
  • Goyal, Prasoon, Zhiting Hu, Xiaodan Liang, Chenyu Wang, and Eric Xing. "Nonparametric Variational Auto-encoders for Hierarchical Representation Learning." arXiv preprint arXiv:1703.07027 (2017).
  • Sabour, Sara, Nicholas Frosst, and Geoffrey Hinton. "Dynamic Routing between Capsules." (2017).
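
Many of the generative-model entries above (Kingma & Welling, Rezende et al., Bowman et al.) rest on the VAE objective: reconstruct the input from a latent code sampled via the reparameterization trick, while a KL term pulls the code toward a standard normal. A minimal MLP sketch of that objective; architecture sizes and the stand-in data are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyVAE(nn.Module):
    """Minimal variational autoencoder (sketch in the spirit of Kingma & Welling, 2013)."""

    def __init__(self, x_dim=784, h_dim=256, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = torch.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(x, x_logits, mu, logvar):
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())  # KL(q(z|x) || N(0, I))
    return (recon + kl) / x.size(0)

x = torch.rand(32, 784)                 # stand-in for binarized images
model = TinyVAE()
loss = vae_loss(x, *model(x))
loss.backward()
```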

Recommendation System

  • Salakhutdinov, Ruslan, Andriy Mnih, and Geoffrey Hinton. "Restricted Boltzmann machines for collaborative filtering." In Proceedings of the 24th international conference on Machine learning, pp. 791-798. ACM, 2007.
  • Wang, Hao, Xingjian Shi, and Dit-Yan Yeung. "Relational Stacked Denoising Autoencoder for Tag Recommendation." In AAAI, pp. 3052-3058. 2015.
  • Wang, Hao, Naiyan Wang, and Dit-Yan Yeung. "Collaborative deep learning for recommender systems." In Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 1235-1244. ACM, 2015.
  • Covington, Paul, Jay Adams, and Emre Sargin. "Deep neural networks for youtube recommendations." In Proceedings of the 10th ACM Conference on Recommender Systems, pp. 191-198. ACM, 2016.
  • Devooght, Robin, and Hugues Bersini. "Collaborative Filtering with Recurrent Neural Networks." arXiv preprint arXiv:1608.07400 (2016).
  • Wang, Hao, Xingjian Shi, and Dit-Yan Yeung. "Collaborative recurrent autoencoder: Recommend while learning to fill in the blanks." In Advances in Neural Information Processing Systems, pp. 415-423. 2016.
  • Tang, Jian, Yifan Yang, Sam Carton, Ming Zhang, and Qiaozhu Mei. "Context-aware Natural Language Generation with Recurrent Neural Networks." arXiv preprint arXiv:1611.09900 (2016).
  • Zhang, Fuzheng, Nicholas Jing Yuan, Defu Lian, Xing Xie, and Wei-Ying Ma. "Collaborative Knowledge Base Embedding for Recommender Systems." KDD, 2016.
  • Dong, Li, Shaohan Huang, Furu Wei, Mirella Lapata, Ming Zhou, and Ke Xu. "Learning to Generate Product Reviews from Attributes." EACL, 2017.
  • He, Xiangnan. "Neural Collaborative Filtering." WWW, 2017
  • Wu, Chao-Yuan, Amr Ahmed, Alex Beutel, Alexander J. Smola, and How Jing. "Recurrent Recommender Networks." WSDM, 2017.
  • Radford, Alec, Rafal Jozefowicz, and Ilya Sutskever. "Learning to generate reviews and discovering sentiment." arXiv preprint arXiv:1704.01444 (2017).
  • Piji Li, Zihao Wang, Zhaochun Ren, Lidong Bing, Wai Lam. "Neural Rating Regression with Abstractive Tips Generation for Recommendation." In SIGIR, pp. xx-xx. 2017.
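
The neural collaborative filtering line above replaces the plain dot product of matrix factorization with learned interaction layers over user and item embeddings. Below is a minimal GMF-style sketch in the spirit of He et al. (2017); sizes and the implicit-feedback loss are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GMF(nn.Module):
    """Generalized matrix factorization (sketch in the spirit of He et al., 2017)."""

    def __init__(self, n_users=1000, n_items=2000, dim=32):
        super().__init__()
        self.user = nn.Embedding(n_users, dim)
        self.item = nn.Embedding(n_items, dim)
        self.out = nn.Linear(dim, 1)

    def forward(self, users, items):
        # Element-wise interaction of user and item factors, then a learned weighting.
        return self.out(self.user(users) * self.item(items)).squeeze(-1)

model = GMF()
users = torch.randint(0, 1000, (64,))
items = torch.randint(0, 2000, (64,))
labels = torch.randint(0, 2, (64,)).float()          # implicit feedback: interacted or not
loss = nn.functional.binary_cross_entropy_with_logits(model(users, items), labels)
loss.backward()
```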

Network Representation Learning

  • Must-read papers on network representation learning (NRL)/network embedding (NE)

Music Generation

  • Using machine learning to generate music

Computational Biology

  • Awesome DeepBio by Gökçen Eraslan

GO

  • Silver, David, Aja Huang, Chris J. Maddison, Arthur Guez, Laurent Sifre, George van den Driessche, Julian Schrittwieser et al. "Mastering the game of Go with deep neural networks and tree search." Nature 529, no. 7587 (2016): 484-489.
  • Tian, Yuandong, and Yan Zhu. "Better Computer Go Player with Neural Network and Long-term Prediction." arXiv preprint arXiv:1511.06410 (2015).

Stock Prediction

  • Xiao Ding, Yue Zhang, Ting Liu, Junwen Duan. "Deep Learning for Event-Driven Stock Prediction". IJCAI 2015.
  • Si, Jianfeng, Arjun Mukherjee, Bing Liu, Sinno Jialin Pan, Qing Li, and Huayi Li. "Exploiting Social Relations and Sentiment for Stock Prediction." EMNLP 2014.
  • Ding, Xiao, Yue Zhang, Ting Liu, and Junwen Duan. "Using Structured Events to Predict Stock Price Movement: An Empirical Investigation." EMNLP 2014.
  • Bollen, Johan, Huina Mao, and Xiaojun Zeng. "Twitter mood predicts the stock market." Journal of Computational Science 2, no. 1 (2011): 1-8.
  • Hengjian Jia. "Investigation Into The Effectiveness Of Long Short Term Memory Networks For Stock Price Prediction." arXiv:1603.07893. (2016)
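
The last entry above applies LSTMs to next-step price prediction; the usual framing is a sliding window of past prices regressed onto the next value. A minimal sketch of that framing with synthetic data; the window length and network size are chosen only for illustration, and real experiments would use normalized historical prices.

```python
import torch
import torch.nn as nn

class PriceLSTM(nn.Module):
    """One-step-ahead price regression (illustrative sketch)."""

    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, window):                      # window: (batch, steps, 1)
        out, _ = self.lstm(window)
        return self.head(out[:, -1]).squeeze(-1)    # predict the next value

# Synthetic series standing in for a price history.
series = torch.sin(torch.linspace(0, 20, 500))
windows = torch.stack([series[i:i + 30] for i in range(400)]).unsqueeze(-1)
targets = torch.stack([series[i + 30] for i in range(400)])

model = PriceLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(50):
    loss = nn.functional.mse_loss(model(windows), targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
```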

Reposted from: https://www.cnblogs.com/weiyinfu/p/9710872.html
