Managing data science in the enterprise
The honeymoon era of data science is ending, and accountability is coming. Successful data science leaders deliver measurable impact on an increasing share of an enterprise's KPIs. Joshua Poduska and Patrick Harrison detail how leading organizations have taken a holistic approach to people, process, and technology to build a sustainable competitive advantage.
Talk Title | Managing data science in the enterprise |
Speakers | Joshua Poduska (Domino Data Lab), Patrick Harrison (S&P Global) |
Conference | Strata Data Conference |
Conf Tag | Make Data Work |
Location | New York, New York |
Date | September 11-13, 2018 |
URL | Talk Page |
Slides | Talk Slides |
Video | |
The honeymoon era of data science is ending, and accountability is coming. Not content to wait for results that may or may not arrive, successful data science leaders deliver measurable impact on an increasing share of an enterprise's KPIs. Joshua Poduska and Patrick Harrison detail how leading organizations have taken a holistic approach to people, process, and technology to build a sustainable competitive advantage.

Outline:

- How to select the right data science project: Many organizations start with the data and look for something "interesting" rather than building a deep understanding of the existing business process and then pinpointing the decision point that can be augmented or automated.
- How to organize data science within the enterprise: There are trade-offs between centralized and federated models; alternatively, a hybrid approach such as a center of excellence can be used.
- Why rapid prototyping and design sprints aren't just for software developers: Leading organizations put prototyping ahead of the data collection process to ensure that stakeholder feedback is captured, increasing the probability of adoption. Some organizations even create synthetic data and naive baseline models to show how the model would affect existing business processes (a minimal sketch of this idea appears after the outline).
- Why order-of-magnitude ROI math should be on every hiring checklist: The ability to estimate the potential business impact of a change in a statistical measure is one of the best predictors of success for a data science team (see the worked example after the outline).
- The difference between "pure research" and "applied templates": 80% of data scientists think they're doing the former, but realistically, the vast majority are applying well-known templates to novel business cases. Knowing which is which, and how to manage them differently, improves morale and output.
- Defining a stakeholder-centric project management process: The most common failure mode is when data science delivers results that are either too late or don't fit into how the business works today, so the results gather dust. Share insights early and often.
- Building for the scale that really matters: Many organizations optimize for the scale of their data but are ultimately overwhelmed by the scale of the growing data science team and its business stakeholders. Team throughput grinds to a crawl as information loss compounds with the number of interactions in a single project, much less across a portfolio of hundreds or thousands of projects.
- Why time to iterate is the most important metric: Many organizations treat model deployment as a moonshot when it should really be laps around a racetrack. Keeping obstacles to testing real results minimal (without sacrificing rigorous review and checks) is another strong predictor of data science success. Facebook and Google deploy new models in minutes, whereas large financial services companies can take 18 months.
- Why delivered is not done: Many organizations have such a hard time deploying a model into production that the data scientists breathe a sigh of relief and move on to the next project. Yet this neglects the critical process of monitoring to ensure the model performs as expected and is used appropriately (a simple drift-check sketch follows the outline).
- Measure everything, including yourself: Ironically, data scientists live in the world of measurement yet rarely turn that lens on themselves. Tracking patterns in aggregate workflows helps create modular templates and guides investment in internal tooling and people to alleviate bottlenecks.
- Risk and change management aren't just for consultants: Data science projects don't usually fail because of the math but rather because of the humans who use the math. Establish training, provide predetermined feedback channels, and measure usage and engagement to ensure success.