Caitlin Ju for The Stanford Daily: Stanford researchers in the Computational Vision and Geometry Lab have designed an autonomously navigating robot prototype that they say understands implicit social conventions and human behavior. Named “Jackrabbot” after the swift but cautious jackrabbit, the visually intelligent and socially amiable robot can maneuver through crowds and pedestrian spaces.
A white ball on wheels, the Jackrabbot is built on a Segway platform and carries a computing unit and multiple sensors that capture the 3-D structure of the surrounding environment. A 360-degree camera and GPS further enhance the robot’s navigation and detection capabilities.
To interact smoothly in public settings, the robot has to know how to avoid someone in a natural way, how to yield the right-of-way and how to leave personal space, among other skills.
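One common way to express skills like “leave personal space” in a navigation planner is as a proximity cost added to the robot’s goal-seeking objective. The sketch below is purely illustrative and not from the Jackrabbot project: the Gaussian penalty, the 0.45 m comfort radius, and the weighting are all assumptions chosen for the example.

```python
import math

# Assumed comfort radius (metres) around a pedestrian; illustrative only.
PERSONAL_SPACE = 0.45

def social_cost(candidate, pedestrians):
    """Sum of Gaussian proximity penalties for one candidate position."""
    cost = 0.0
    for px, py in pedestrians:
        d = math.hypot(candidate[0] - px, candidate[1] - py)
        cost += math.exp(-(d / PERSONAL_SPACE) ** 2)
    return cost

def choose_step(current, goal, pedestrians, step=0.25):
    """Pick the neighbouring step trading goal progress against social cost."""
    best, best_score = current, float("inf")
    for dx, dy in [(step, 0), (-step, 0), (0, step), (0, -step)]:
        cand = (current[0] + dx, current[1] + dy)
        goal_dist = math.hypot(goal[0] - cand[0], goal[1] - cand[1])
        # The weight 2.0 on the social term is an arbitrary example value.
        score = goal_dist + 2.0 * social_cost(cand, pedestrians)
        if score < best_score:
            best, best_score = cand, score
    return best
```

With no pedestrians nearby, the planner simply steps toward the goal; as a pedestrian gets closer, the proximity penalty grows and the robot prefers paths that keep its distance.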
TX2 robots: redefining performance by combining collaborative safety and high performance in a single machine. Thanks to their unique features, these pioneering robots can be deployed in all areas, including sensitive and restrictive environments. Safety functions are simple and inexpensive to implement, allowing closer interaction between robots and human operators while still guaranteeing the protection of your people, production and investment.