January 1, 2020


Deep learning methods for natural language processing


Garrett Hoffman walks you through deep learning methods for natural language processing and natural language understanding tasks, using a live example in Python and TensorFlow with StockTwits data. Methods include word2vec, recurrent neural networks and variants (LSTM, GRU), and convolutional neural networks.

Talk Title: Deep learning methods for natural language processing
Speakers: Garrett Hoffman (StockTwits)
Conference: O’Reilly Artificial Intelligence Conference
Conf Tag: Put AI to Work
Location: New York, New York
Date: April 16-18, 2019
URL: Talk Page
Slides: Talk Slides

You’ll explore use cases and motivations for using these methods, gain a conceptual intuition about how these models work, and briefly review the mathematics that underlie each methodology.

Outline:

- Representation learning for text with word2vec word embeddings: the CBOW and skip-gram models; how to train custom word embeddings; how to use pretrained word embeddings, such as those trained on Google News
- Traditional recurrent neural networks (RNNs): why these models often perform better than traditional alternatives; variants such as long short-term memory (LSTM) cells and gated recurrent units (GRUs); why these variants improve accuracy
- Convolutional neural networks (CNNs): why CNNs, traditionally applied to computer vision, are now being applied to language models; the advantages they have over RNNs; how RNNs can be used to learn generative models for text synthesis, and applications of this method
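To make the word2vec portion concrete, here is a minimal sketch (not the talk's own code) of how skip-gram training pairs are built from a toy sentence: each center word is paired with every word inside a context window, and the model then learns to predict context from center. CBOW flips this, predicting the center word from the averaged context. The corpus and window size below are illustrative assumptions.

```python
import numpy as np

def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs, skip-gram style."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

# Toy corpus (illustrative); a real run would use a large text stream.
corpus = "the market rallied after earnings".split()
pairs = skipgram_pairs(corpus, window=1)
```

With `window=1` each interior word yields two pairs, e.g. `('market', 'the')` and `('market', 'rallied')`; a softmax (or negative-sampling) objective over these pairs is what produces the embedding vectors.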
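The RNN-variant discussion centers on gating, which is what lets LSTMs and GRUs carry information across long sequences. Below is an illustrative numpy sketch of a single GRU step (one common gate convention; formulations vary across papers and libraries), with randomly initialized weights standing in for trained parameters:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU step: gates decide how much old state to keep vs. overwrite."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # interpolate old and new state

rng = np.random.default_rng(0)
d_in, d_h = 4, 3  # toy sizes, for illustration only
params = [rng.normal(size=(d_h, d_in)) if i % 2 == 0 else rng.normal(size=(d_h, d_h))
          for i in range(6)]

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):  # run over a 5-step input sequence
    h = gru_step(x, h, params)
```

Because the update gate interpolates rather than overwrites, gradients can flow through the `(1 - z) * h` path largely unattenuated, which is the intuition behind why these variants train better than a vanilla RNN on long sequences.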
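For the CNN section, the key idea is that a filter slides over consecutive word embeddings (an n-gram detector) and max-over-time pooling keeps the strongest activation, so every filter yields one feature regardless of sentence length. A hedged numpy sketch of that forward pass, with random embeddings and filters standing in for learned ones:

```python
import numpy as np

def conv1d_max(embeddings, filt):
    """Slide a width-k filter over token embeddings, then max-pool over time."""
    k, d = filt.shape
    n = embeddings.shape[0]
    acts = [np.sum(embeddings[i:i + k] * filt) for i in range(n - k + 1)]
    return max(acts)  # max-over-time pooling keeps the strongest n-gram match

rng = np.random.default_rng(1)
sentence = rng.normal(size=(7, 5))    # 7 tokens with 5-dim embeddings (toy sizes)
filters = rng.normal(size=(3, 3, 5))  # 3 filters, each spanning 3 tokens
features = np.array([conv1d_max(sentence, f) for f in filters])
```

Because every position is convolved independently, this computation parallelizes across the sequence, which is one of the practical advantages CNN text models have over sequential RNNs.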
