Thanks to improvements in algorithms, processors, and developer tools, it is increasingly practical to incorporate machine perception into embedded systems and other edge applications. Indeed, products with machine perception are becoming common in virtually every industry, from healthcare to transportation to entertainment.
But implementing perception that works reliably in the real world remains challenging, especially on resource-constrained devices.
In this session, Jeff Bier (Founder of the Edge AI and Vision Alliance and Program Chair for the Embedded Vision Summit) will share an insider’s perspective on some of the most important recent developments and trends in the fast-moving world of embedded machine perception technologies and applications.
Jeff will also highlight some of the most exciting emerging applications for these technologies. In addition, he’ll explore how development practices are changing, and shine a light on the most common pain points reported by developers, drawing on results from the Edge AI and Vision Alliance’s annual Computer Vision and Perceptual AI Developer Survey.