A novel adoption of LSTM in customer touchpoint prediction problems
KC Tung explains why LSTM provides great flexibility for modeling the consumer touchpoint sequence problem, allowing just-in-time insights into an advertising campaign's effectiveness across all touchpoints (channels) and empowering advertisers to evaluate, adjust, or reallocate resources and investments to maximize campaign effectiveness.
| Talk Title | A novel adoption of LSTM in customer touchpoint prediction problems |
| Speakers | KC Tung (Microsoft) |
| Conference | Artificial Intelligence Conference |
| Conf Tag | Put AI to Work |
| Location | San Francisco, California |
| Date | September 5-7, 2018 |
| URL | Talk Page |
| Slides | Talk Slides |
| Video | |
LSTM networks are widely used to solve sequence prediction problems, most notably in natural language processing (NLP) and neural machine translation (NMT). In particular, the sequence-to-sequence (seq2seq) model is the workhorse for translation, speech recognition, and text summarization challenges.

If a collection of individual event sequences is organized as a corpus, an LSTM model can be constructed to predict a target outcome (i.e., conversion) or a target sequence of events (a predicted touchpoint sequence that leads to conversion). The adoption of LSTM for touchpoint prediction stems from the need to model the customer journey, or conversion funnel, as a series of touchpoints. For an advertiser or marketer, taking into account the sequence of events that leads to a conversion adds tremendous value to the understanding of the conversion funnel and the impact of each touchpoint type, and can even identify high-potential leads.

With LSTM, touchpoint prediction can be framed as four different types of prediction problems:

- Sequence prediction: the model predicts a future sequence (e.g., TV-TV-buy).
- Sequence classification: the model predicts the target outcome (e.g., buy).
- Sequence generation: the model predicts a target sequence with characteristics similar to the input sequence (e.g., repeat buy).
- Sequence-to-sequence prediction: the model predicts a target sequence that contains "buy".

KC Tung explains why LSTM provides great flexibility for modeling the consumer touchpoint sequence problem in a way that allows just-in-time insights into an advertising campaign's effectiveness across all touchpoints (channels). LSTM models can be implemented at scale to identify potential marketing leads from known touchpoint sequences during the campaign, empowering advertisers to evaluate, adjust, or reallocate resources and investments to maximize campaign effectiveness. Along the way, KC offers demos of LSTM models implemented in Keras and TensorFlow.
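As a concrete illustration of organizing touchpoint sequences as a corpus, the minimal sketch below integer-encodes hypothetical customer journeys and pads them to a fixed length, producing the kind of input an LSTM classifier (e.g., a Keras Embedding + LSTM stack) would consume for the sequence classification framing. The touchpoint names, journeys, and labels here are illustrative assumptions, not data from the talk.

```python
# Sketch: preparing a touchpoint "corpus" for an LSTM-based
# sequence classifier (predicting conversion vs. no conversion).
# All journeys and touchpoint names below are hypothetical.

PAD = 0  # index 0 is reserved for padding shorter sequences


def build_vocab(sequences):
    """Map each distinct touchpoint name to a positive integer index."""
    vocab = {}
    for seq in sequences:
        for touchpoint in seq:
            if touchpoint not in vocab:
                vocab[touchpoint] = len(vocab) + 1  # 0 reserved for PAD
    return vocab


def encode(seq, vocab, max_len):
    """Integer-encode a sequence, keep its last max_len events, left-pad."""
    ids = [vocab[t] for t in seq[-max_len:]]
    return [PAD] * (max_len - len(ids)) + ids


# Hypothetical journeys; the label marks whether the customer converted.
journeys = [
    (["email", "search", "display", "buy"], 1),
    (["tv", "tv", "search"], 0),
    (["display", "email", "buy"], 1),
]

vocab = build_vocab(seq for seq, _ in journeys)
max_len = max(len(seq) for seq, _ in journeys)

X = [encode(seq, vocab, max_len) for seq, _ in journeys]  # model inputs
y = [label for _, label in journeys]                      # labels
```

The same encoded sequences also serve the other framings: for sequence prediction or generation, the target becomes the next touchpoint (or a whole sequence) rather than a single conversion label.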