February 15, 2020

171 words 1 min read

Deep learning methods for natural language processing

Garrett Hoffman walks you through deep learning methods for natural language processing and natural language understanding tasks, using a live example in Python and TensorFlow with StockTwits data. Methods include Word2Vec, recurrent neural networks (RNNs) and variants (long short-term memory [LSTM] and gated recurrent unit [GRU]), and convolutional neural networks.

Talk Title: Deep learning methods for natural language processing
Speakers: Garrett Hoffman (StockTwits)
Conference: Strata Data Conference
Conf Tag: Make Data Work
Location: New York, New York
Date: September 24-26, 2019
URL: Talk Page
Slides: Talk Slides
Video:

Garrett Hoffman walks you through deep learning methods for natural language processing and natural language understanding tasks, using a live example in Python and TensorFlow with StockTwits data. Methods include Word2Vec, RNNs and variants (LSTM, GRU), and convolutional neural networks. You’ll explore use cases and motivations for using these methods, gain a conceptual intuition about how these models work, and briefly review the mathematics that underlie each methodology.

Outline:
- Representation learning for text with Word2Vec word embeddings
- Traditional RNNs
- Convolutional neural networks (CNNs)
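
To make the pipeline concrete, here is a minimal, hypothetical TensorFlow/Keras sketch (not the speaker's actual code or the StockTwits dataset) of an embedding-plus-RNN text classifier of the kind the talk describes; the vocabulary size, sequence length, and layer widths are illustrative assumptions.

```python
import tensorflow as tf

VOCAB_SIZE = 20000   # assumed vocabulary size for tokenized messages
EMBED_DIM = 128      # dimension of the learned word vectors
MAX_LEN = 50         # assumed maximum number of tokens per message

model = tf.keras.Sequential([
    tf.keras.Input(shape=(MAX_LEN,)),
    # Dense word vectors; these could be initialized from pretrained
    # Word2Vec embeddings instead of learned from scratch.
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    # Recurrent layer over the token sequence; swap in tf.keras.layers.GRU
    # for the GRU variant.
    tf.keras.layers.LSTM(64),
    # Single sigmoid output, e.g. bullish vs. bearish sentiment.
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

For the convolutional approach the outline also lists, the recurrent layer can be replaced with tf.keras.layers.Conv1D followed by tf.keras.layers.GlobalMaxPooling1D, which picks up n-gram-like features rather than modeling the sequence step by step.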
