February 11, 2020

244 words 2 mins read

Practical feature engineering

Feature engineering is generally the section that gets left out of machine learning books, but it's also the most critical part in practice. Ted Dunning explores techniques, a few well known and some rarely discussed outside the institutional knowledge of top teams, for handling categorical inputs, natural language, transactions, and more in the context of machine learning.

Talk Title: Practical feature engineering
Speakers: Ted Dunning (MapR, now part of HPE)
Conference: Strata Data Conference
Conference Tag: Make Data Work
Location: New York, New York
Date: September 24-26, 2019
URL: Talk Page
Slides: Talk Slides
Video:

Feature engineering is generally the section that gets left out of machine learning books, but it's also the most important part of building successful models, even in today's world of deep learning. While academic courses on machine learning focus on gradients and the latest flavor of recurrent network, Ted Dunning explores the techniques practitioners in the real world use to seek out better features and extract value from them, relying on a variety of time-honored (and occasionally exceptionally clever) heuristics. In a sense, feature engineering is the Rodney Dangerfield of machine learning: it never gets any respect. It is, however, the task that will give you the most model performance for the time spent. This work is not just the work of the data scientist. Good features also encode business realities and are the product of good business sense and good data engineering.
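Handling categorical inputs is one of the areas the talk covers. As an illustration only (not taken from the talk itself), the sketch below shows two common practitioner heuristics for categorical features, frequency encoding and smoothed target encoding, applied to a made-up pandas DataFrame; the column names, label, and smoothing weight k are all assumptions made for the example.

```python
import pandas as pd

# Hypothetical toy data: a categorical "merchant" column and a binary
# "fraud" label. Names and values are illustrative only.
df = pd.DataFrame({
    "merchant": ["grocer", "grocer", "cafe", "cafe", "cafe", "electronics"],
    "fraud":    [0,        0,        0,      1,      0,      1],
})

# Frequency encoding: replace each category with how often it appears.
# This keeps a single dense column instead of one sparse one-hot column
# per category.
freq = df["merchant"].value_counts(normalize=True)
df["merchant_freq"] = df["merchant"].map(freq)

# Smoothed target (mean) encoding: replace each category with a blend of
# its observed fraud rate and the global rate. The smoothing weight k
# (assumed here) guards against overfitting on rare categories.
k = 5.0
global_rate = df["fraud"].mean()
stats = df.groupby("merchant")["fraud"].agg(["mean", "count"])
smoothed = (stats["count"] * stats["mean"] + k * global_rate) / (stats["count"] + k)
df["merchant_target_enc"] = df["merchant"].map(smoothed)

print(df)
```

In a real pipeline the target encoding would be fit on training folds only and applied to held-out data, so the encoded feature does not leak label information.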
