
Toyota Research Institute Has Integrated Robots With VR To Teach Them New Skills

The Toyota Research Institute (TRI) has been researching and developing assistive home robots, with the vision of empowering people by having robots perform useful, human-level tasks in real homes.

For a robot to perform a task, it needs data and key reference points to navigate each step. Often the robot is preprogrammed, like those used on assembly lines, but here the robot is integrated with VR so it can learn specific tasks set by a human teacher.

Using the robot's manipulation and mobility capabilities, the teacher can see a model of the robot and the surrounding objects through VR. This allows human trainers to teach robots arbitrary tasks with a variety of objects, rather than only specific tasks performed in a more controlled setting. The teacher can show the robot how to grip certain objects and how to identify which objects matter when it is commanded to perform a set of duties.
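To make this concrete, the sketch below shows one way a VR teaching session could be recorded as data: each sampled controller state becomes a keyframe of gripper pose, grasp command, and the object the teacher is handling. The class and field names here are illustrative assumptions, not TRI's actual software.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical sketch: recording a VR teaching session as a sequence of
# keyframes (gripper pose + grasp state + the object being handled).
# None of these names come from TRI's real system.

@dataclass
class Keyframe:
    position: Tuple[float, float, float]            # gripper position in metres
    orientation: Tuple[float, float, float, float]  # quaternion (x, y, z, w)
    gripper_closed: bool                            # grasp command from the VR controller
    target_object: str                              # label the teacher attached to the object

@dataclass
class Demonstration:
    task_name: str
    keyframes: List[Keyframe] = field(default_factory=list)

    def record(self, kf: Keyframe) -> None:
        """Append one sampled controller state to the demonstration."""
        self.keyframes.append(kf)

# Example: a teacher demonstrating how to pick up a mug.
demo = Demonstration(task_name="pick_up_mug")
demo.record(Keyframe((0.45, 0.10, 0.30), (0, 0, 0, 1), gripper_closed=False, target_object="mug"))
demo.record(Keyframe((0.45, 0.10, 0.12), (0, 0, 0, 1), gripper_closed=True, target_object="mug"))
print(f"Recorded {len(demo.keyframes)} keyframes for task '{demo.task_name}'")
```

A recording like this could later be replayed, generalised to new object positions, or used as training data, which is the core idea behind teaching by demonstration.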

Each demonstration performed by the teacher translates into data that is then used to teach behaviours linked to objects in the environment. The robot can also identify which floor surfaces it can and cannot drive over, ensuring safe movement; it does this by combining motion sensors with visual depth sensing to build a 3D representation of its surroundings.
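As a rough illustration of that last point, the sketch below checks traversability over a height map built from depth data: cells where the local step height exceeds a safe threshold are marked as unsafe to drive over. The grid, resolution, and threshold are illustrative assumptions, not details of TRI's navigation stack.

```python
import numpy as np

# Hypothetical sketch of a traversability check: given a height map derived
# from depth-camera data (values in metres), flag cells the robot should
# avoid because the step up or down to a neighbouring cell is too large.

def traversable_mask(height_map: np.ndarray, max_step: float = 0.02) -> np.ndarray:
    """Return a boolean grid: True where the surface is safe to drive over."""
    # Height change relative to the neighbouring cell along each axis.
    dx = np.abs(np.diff(height_map, axis=0, prepend=height_map[:1, :]))
    dy = np.abs(np.diff(height_map, axis=1, prepend=height_map[:, :1]))
    step = np.maximum(dx, dy)
    return step <= max_step

# Example: a flat floor with a 5 cm ledge across the middle.
floor = np.zeros((10, 10))
floor[5:, :] = 0.05
mask = traversable_mask(floor)
print("Safe cells:", int(mask.sum()), "of", mask.size)
```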

Currently the robot is a research prototype, not a product concept for consumers. However, as robots continue to advance, we could see them appearing in homes and other settings in the not-too-distant future.