Lessons Learned from Building a Social Robot
A glance into our future reveals an environment filled with machines designed to help us humans with our day-to-day tasks. Such a future can only exist if we can equip our machines with the ability to socialize and interact. The emerging field of social robotics aims to create these interacting entities. Achieving such an ambitious goal requires a shift of focus in the algorithms we develop: from motion and control to decision making and dialog management; from a function-first approach to one that thinks beyond the function.

At Intuition Robotics we are building the tools and the technology for creating social agents. Our first application, ElliQ, is a social robot for the elderly, aimed at reducing loneliness and increasing quality of life. In trying to create the best possible experience for ElliQ’s users, we face multiple challenges, such as decision making under high uncertainty, multi-modal interaction design, and personalization. Solving these challenges forces us to think outside the box, combining algorithms from the world of cognitive computing, heuristics, and a variety of learning techniques ranging from simple statistical models to reinforcement learning. In this talk I will share some of the work we have been doing: how we adjust the robot’s personality to the user with online learning, how active learning helps us get to know our users better, and finally how cognitive computing enabled us to create an agent with multiple goals.
Shay Zweig is the Head of AI at Intuition Robotics. His team combines research and reduction to practice of state-of-the-art algorithms in decision making, robotic vision, dialog management, and more. Shay received his PhD in Neuroscience from Bar-Ilan University in Israel. His research combined studying the visual system in the brain with investigating deep learning for computer vision.