Five-senses data: Using your senses to improve data signal and value
Data should be something you can see, feel, hear, taste, and touch. Drawing on real-world examples, Cameron Turner, Brad Sarsfield, Hanna Kang-Brown, and Evan Macmillan cover the emerging field of sensory data visualization, including data sonification, and explain where the field is headed.
Talk Title | Five-senses data: Using your senses to improve data signal and value |
Speakers | Cameron Turner, Brad Sarsfield, Hanna Kang-Brown, Evan Macmillan |
Conference | Strata + Hadoop World |
Conf Tag | Make Data Work |
Location | New York, New York |
Date | September 27-29, 2016 |
URL | Talk Page |
Slides | Talk Slides |
Video | |
Data should be something you can see, feel, hear, taste, and touch. Cameron Turner, Brad Sarsfield, Hanna Kang-Brown, and Evan Macmillan cover the emerging field of sensory data visualization, including data sonification. In an anecdotal survey, they explore real-life examples of solutions deployed to production in industries ranging from consumer goods to heavy industrial and large-scale manufacturing to the IoT, all of which use hearing, touch, and other senses as alternatives to what has traditionally been called data visualization. They then investigate the hypothesis that we might consume information more effectively by moving beyond words, numbers, and pictures and using sound, smell, and even taste to better understand the state of the world. Topics tie into Cameron's recent interview on the O'Reilly Hardware Podcast, which focused on data sonification, and extend into the future of sensory data collection and consumption.
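The talk itself is not a coding session, but as a rough illustration of what data sonification can mean in practice, the sketch below maps a numeric series to pitch and writes the result to a WAV file using only Python's standard library. The `sonify` function, the frequency range, and the sample readings are assumptions made here for illustration, not anything the speakers present.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # audio samples per second


def sonify(values, filename="sonified.wav", note_seconds=0.25,
           low_hz=220.0, high_hz=880.0):
    """Illustrative sketch: map each data point to a pitch between
    low_hz and high_hz and write the tone sequence to a WAV file.
    (Hypothetical helper, not from the talk.)"""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for a flat series
    frames = bytearray()
    for v in values:
        # Linear mapping: larger values produce higher pitches
        freq = low_hz + (v - lo) / span * (high_hz - low_hz)
        for i in range(int(SAMPLE_RATE * note_seconds)):
            sample = math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.5))
    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)       # mono
        wav.setsampwidth(2)       # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))


if __name__ == "__main__":
    # Example: "listen" to two days of hourly sensor readings (synthetic data)
    readings = [20 + 5 * math.sin(i / 4) for i in range(48)]
    sonify(readings)
```

Even this minimal mapping conveys the core idea behind sonification: trends, spikes, and anomalies in a data stream become audible as changes in pitch, letting a listener monitor state without looking at a chart.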