Meet Hedley, an AI, Linux, and smart sensor robotic skull
Rob Reilly explains how he brought Hedley, his robotic skull, to life. Hedley uses a JeVois smart machine vision sensor and artificial intelligence algorithms (developed by Laurent Itti) to track subjects as they move around in the skull's field of view. Come meet Hedley and learn about the latest developments in open source sensors, AI algorithms, and Linux-based physical computing.
| Field | Value |
| --- | --- |
| Talk Title | Meet Hedley, an AI, Linux, and smart sensor robotic skull |
| Speakers | Rob Reilly (Rob “drtorq” Reilly) |
| Conference | O’Reilly Open Source Convention |
| Conf Tag | Put open source to work |
| Location | Portland, Oregon |
| Date | July 16-19, 2018 |
| URL | Talk Page |
| Slides | Talk Slides |
| Video | |
Rob Reilly explains how he brought Hedley, his robotic skull, to life. Hedley’s cranium is packed with a JeVois smart machine vision sensor, an early Arduino board, and a three-color LED eyeball in the left eye socket, among other components. The Arduino takes directions from the JeVois sensor to control Hedley’s movements. The skull attaches to a steampunk-themed mechanism with both a Raspberry Pi 3 and an Arduino Pro Mini in its base: the Pi provides a monitoring and development environment, while the Pro Mini controls how Hedley rises out of his storage box.

The JeVois sensor runs Linux on a four-core ARM processor, outputting an augmented reality video stream and location data over USB and serial lines. By connecting a projector and keyboard, Hedley can share how his on-board artificial intelligence algorithms interpret what he sees. In a live demo, you’ll watch Hedley’s analysis in real time, using the guvcview application on the JeVois-connected Pi 3.

Come meet Hedley and learn about the latest developments in open source sensors, AI algorithms, and Linux-based physical computing. You’ll leave inspired to start building your own smart sensor, AI, and multi-Linux-module-powered projects.
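To make the JeVois-to-Arduino link concrete: JeVois vision modules can emit standardized serial messages such as `T2 x y`, reporting a tracked target's position in JeVois's standardized coordinate space (roughly -1000 to 1000 on each axis). The sketch below is a hedged illustration, not Hedley's actual code: it parses such a message and maps the x coordinate to a 0-180° servo angle, the kind of translation an Arduino (or a host script on the Pi) might do to point the skull at a subject. The function names and the linear mapping are assumptions for illustration.

```python
def parse_t2(line):
    """Parse a JeVois-style 'T2 x y' standardized serial message.

    Returns (x, y) as ints, or None if the line is not a valid T2 message.
    """
    parts = line.strip().split()
    if len(parts) != 3 or parts[0] != "T2":
        return None
    try:
        return int(parts[1]), int(parts[2])
    except ValueError:
        return None


def x_to_servo_angle(x, x_min=-1000, x_max=1000):
    """Map a JeVois x coordinate (about -1000..1000) to a 0-180 servo angle.

    Clamps out-of-range values so a noisy reading can't command an
    impossible angle.
    """
    x = max(x_min, min(x_max, x))
    return round((x - x_min) * 180 / (x_max - x_min))


if __name__ == "__main__":
    msg = "T2 250 -100"
    pos = parse_t2(msg)
    print(pos)                       # (250, -100)
    print(x_to_servo_angle(pos[0]))
```

On real hardware, the same parsing logic would sit in a serial read loop (e.g. over the JeVois 4-pin serial port), with the resulting angle written to the servo driving Hedley's pan axis.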