December 24, 2019

264 words 2 mins read

word2vec and friends


Bruno Gonçalves explores word2vec and its variations, discussing the main concepts and algorithms behind the neural network architecture used in word2vec and the word2vec reference implementation in TensorFlow. Bruno then presents a bird's-eye view of the emerging field of "anything"-2vec methods that use variations of the word2vec neural network architecture.

Talk Title word2vec and friends
Speakers Bruno Gonçalves (Data For Science)
Conference Artificial Intelligence Conference
Conf Tag Put AI to Work
Location San Francisco, California
Date September 18-20, 2017
URL Talk Page
Slides Talk Slides
Video

Word embeddings have received a lot of attention ever since Tomas Mikolov published word2vec in 2013 and showed that the embeddings a neural network learned by “reading” a large corpus of text preserved semantic relations between words. As a result, this type of embedding began to be studied in more detail and applied to more serious NLP and IR tasks, such as summarization and query expansion. More recently, researchers and practitioners alike have come to appreciate the power of this type of approach, creating a burgeoning cottage industry centered around applying Mikolov’s original approach to different areas. Bruno Gonçalves explores word2vec and its variations, discussing the main concepts and algorithms behind the neural network architecture used in word2vec and the word2vec reference implementation in TensorFlow. Bruno then presents a bird’s-eye view of the emerging field of 2vec methods (phrase2vec, doc2vec, dna2vec, node2vec, etc.) that use variations of the word2vec neural network architecture.

Outline:
- Neural network architecture and algorithms underlying word2vec
- A brief refresher of TensorFlow
- A detailed discussion of TensorFlow’s reference implementation
- word2vec variations and their applications
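
To make the skip-gram idea behind word2vec concrete, here is a minimal training-step sketch in TensorFlow that predicts a context word from its center word and uses noise-contrastive estimation instead of a full softmax over the vocabulary, in the spirit of (but not identical to) the reference implementation discussed in the talk. The vocabulary size, embedding dimension, negative-sample count, and the randomly generated (center, context) pairs are illustrative assumptions, not values from the talk.

```python
import numpy as np
import tensorflow as tf

# Illustrative hyperparameters (assumptions, not values from the talk).
vocab_size = 10000      # number of distinct words in the vocabulary
embedding_dim = 128     # size of each word vector
num_negative = 5        # negative samples drawn per (center, context) pair

# Trainable parameters: center-word embeddings and output-side (context) weights.
embeddings = tf.Variable(tf.random.uniform([vocab_size, embedding_dim], -1.0, 1.0))
nce_weights = tf.Variable(tf.random.truncated_normal(
    [vocab_size, embedding_dim], stddev=1.0 / np.sqrt(embedding_dim)))
nce_biases = tf.Variable(tf.zeros([vocab_size]))

optimizer = tf.keras.optimizers.SGD(learning_rate=1.0)

@tf.function
def train_step(center_ids, context_ids):
    """One skip-gram update: predict a context word from its center word."""
    with tf.GradientTape() as tape:
        center_vecs = tf.nn.embedding_lookup(embeddings, center_ids)
        # Noise-contrastive estimation approximates the softmax over the full vocabulary.
        loss = tf.reduce_mean(
            tf.nn.nce_loss(weights=nce_weights,
                           biases=nce_biases,
                           labels=tf.expand_dims(context_ids, 1),
                           inputs=center_vecs,
                           num_sampled=num_negative,
                           num_classes=vocab_size))
    variables = [embeddings, nce_weights, nce_biases]
    grads = tape.gradient(loss, variables)
    optimizer.apply_gradients(zip(grads, variables))
    return loss

# Dummy batch of (center, context) word-id pairs standing in for windows from a real corpus.
centers = tf.constant(np.random.randint(0, vocab_size, size=64), dtype=tf.int64)
contexts = tf.constant(np.random.randint(0, vocab_size, size=64), dtype=tf.int64)
print(float(train_step(centers, contexts)))
```

After training on a real corpus, the rows of `embeddings` are the word vectors whose geometry preserves the semantic relations the talk describes; the NCE trick is what keeps each update cheap even with a large vocabulary.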
