Computer Vision Algorithms Developed for Samsung's Humanoid Robot to See and Map

Researchers from the University of Bristol have developed computer vision algorithms that enable Samsung's latest humanoid robot, Roboray, to build real-time 3D visual maps so that it can move around more efficiently.

Using its cameras, the robot builds a map of reference points relative to its surroundings and is able to "remember" where it has been before. The ability to build visual maps quickly, anywhere, is essential for autonomous robot navigation, particularly when the robot enters places with no global positioning system (GPS) signal or other references.
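
The release does not spell out the algorithms, but the idea of building a map from camera views and "remembering" previously visited places can be illustrated with a toy sketch. In the Python snippet below, whole-image descriptors stand in for real feature matching and geometric checks, and all names (VisualMemory, remember, recall) are hypothetical rather than part of the Bristol or Samsung code.

```python
# A minimal sketch (not the Bristol/Samsung implementation) of the idea above:
# the robot stores visual "keyframes" as it moves, and "remembers" a place by
# matching the current view against the stored ones.
import numpy as np


class VisualMemory:
    def __init__(self, match_threshold=0.9):
        self.keyframes = []                 # list of (descriptor, pose) pairs
        self.match_threshold = match_threshold

    def remember(self, descriptor, pose):
        """Store the current view (as a feature descriptor) and the pose it was seen from."""
        self.keyframes.append((np.asarray(descriptor, float), np.asarray(pose, float)))

    def recall(self, descriptor):
        """Return the stored pose whose view best matches the current one, if any.

        Cosine similarity between whole-image descriptors stands in for real
        place recognition, which would use local features and geometry checks.
        """
        query = np.asarray(descriptor, float)
        best_pose, best_score = None, self.match_threshold
        for stored, pose in self.keyframes:
            score = np.dot(stored, query) / (np.linalg.norm(stored) * np.linalg.norm(query) + 1e-9)
            if score > best_score:
                best_pose, best_score = pose, score
        return best_pose


if __name__ == "__main__":
    memory = VisualMemory()
    rng = np.random.default_rng(0)
    kitchen_view = rng.standard_normal(128)
    memory.remember(kitchen_view, pose=[2.0, 1.5, 0.0])    # x, y, heading
    # Later, a slightly noisy view of the same place is recognised again:
    print(memory.recall(kitchen_view + 0.05 * rng.standard_normal(128)))
```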

Roboray is one of the most advanced humanoid robots in the world, with a height of 140 cm and a weight of 50 kg. It has a stereo camera on its head and 53 actuators, including six in each leg and 12 in each hand.

The robot features a range of novel technologies. In particular, it walks in a more human-like manner by using what is known as dynamic walking: the robot is effectively falling at every step, using gravity to carry it forward with little energy use. This is the way humans walk, and it contrasts with most other humanoid robots, which "bend their knees to keep the centre of mass low and stable". This way of walking is also more challenging for the computer vision algorithms, because objects in the images move more quickly.
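
As a rough illustration of the "falling at every step" idea (not Samsung's walking controller), the standard linear inverted pendulum model shows how gravity alone accelerates the centre of mass towards the next foothold once it is slightly ahead of the support foot. The numbers below are purely illustrative.

```python
# Toy linear inverted pendulum: with the centre of mass held at constant height z
# above the support foot (placed at x = 0), the dynamics are x'' = (g / z) * x,
# so any forward lean is amplified by gravity alone, with no actuator effort.
g = 9.81      # gravity, m/s^2
z = 0.8       # assumed constant centre-of-mass height, m
dt = 0.005    # integration step, s

x, v, t = 0.02, 0.0, 0.0     # CoM starts 2 cm ahead of the support foot, at rest
while x < 0.30:              # integrate until the CoM has travelled a 30 cm step
    a = (g / z) * x          # forward acceleration supplied entirely by gravity
    v += a * dt
    x += v * dt
    t += dt

print(f"CoM reaches the next foothold after {t:.2f} s at {v:.2f} m/s, "
      "with gravity providing all of the forward acceleration.")
```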

The Bristol team, which has been collaborating with Samsung Electronics in South Korea since 2009, was in charge of the computer vision aspects of 3D SLAM (simultaneous localisation and mapping).

Dr Walterio Mayol-Cuevas, Deputy Director of the Bristol Robotics Lab, Reader in the Department of Computer Science at the University of Bristol and leader of the team, said: "A humanoid robot has an ideal shape to use the same tools and spaces designed for people, as well as a good test bed to develop machine intelligence designed for human interaction.

"Robots that close the gap with human behaviours, such as by featuring dynamic walking, will not only allow more energy efficiency but be better accepted by people as they move in a more natural manner."

Dr Sukjune Yoon from the Samsung Advanced Institute of Technology (SAIT), speaking about the collaboration with the University of Bristol, said: "Bristol's visual SLAM and their other real-time visual technologies have been very beneficial for Samsung Electronics' humanoid robot project.

"Mapping in real-time for a biped humanoid is much harder than for wheeled vehicles not only because there is less constant contact with the ground. In the near future, it is expected that humanoid robotic technologies will be able to provide a valuable service to society with robots working alongside people."

The technology for rapid 3D visual mapping developed at Bristol is internationally renowned for its ability to track robustly and to recover from rapid motions and occlusions, which is essential when the humanoid moves and turns at normal walking speeds. The Bristol work has also been used for a number of applications outside robotics, from augmented reality to commercial analysis of wearable gaze data.
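
The article gives no implementation detail, but the behaviour it describes, tracking continuously and recovering by searching the stored map after a rapid turn or an occlusion, can be sketched as a simple control flow. In the toy Python example below, 2D points stand in for camera views, and every function, name and threshold is hypothetical rather than the Bristol system.

```python
# Sketch of "track, then recover when tracking is lost": incremental tracking
# fails when the view jumps too far between frames (a fast turn or an occlusion),
# and recovery falls back to a global search over the stored keyframe map.
import math

TRACK_RADIUS = 0.5   # largest frame-to-frame motion the incremental tracker handles


def track(view, last_pose):
    """Incremental tracking: succeeds only for small frame-to-frame motion."""
    if last_pose is not None and math.dist(view, last_pose) <= TRACK_RADIUS:
        return view
    return None          # tracking lost


def relocalise(view, keyframes):
    """Global recovery: match the current view against every stored keyframe."""
    candidates = [k for k in keyframes if math.dist(view, k) <= TRACK_RADIUS]
    return min(candidates, key=lambda k: math.dist(view, k)) if candidates else None


keyframes, last_pose = [], None
for view in [(0.0, 0.0), (0.3, 0.1), (0.5, 0.2), (3.0, 3.0), (0.4, 0.2)]:
    pose = track(view, last_pose) or relocalise(view, keyframes)
    if pose is not None:
        keyframes.append(pose)               # keep growing the map while localised
    print(view, "->", f"localised at {pose}" if pose is not None else "lost")
    last_pose = pose
```

In this run the fourth view is too far from the previous one, so tracking is lost, and the fifth view is recovered by matching against the stored keyframes rather than the last pose.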

A paper describing some aspects of this collaboration with Samsung is published in the journal Advanced Robotics.

Paper: Sukjune Yoon, Seungyong Hyung, Minhyung Lee, Kyung Shik Roh, SungHwan Ahn, Andrew Gee, Pished Bunnun, Andrew Calway and Walterio W. Mayol-Cuevas, "Real-time 3D simultaneous localization and map-building for a dynamic walking humanoid robot", Advanced Robotics, Volume 27, Issue 10, 2013 (published online ahead of print).

Source: http://www.bristol.ac.uk
