The Human Touch
Sophie Hand | EU Automation
Can a robot have feelings? Some robots are programmed to show emotion, from online artificial intelligence (AI) chatbot assistants to Sophia the social humanoid robot who can give emotional responses in a conversation. These robots might be able to simulate emotions, but will they ever be able to experience senses such as touch? Here, Sophie Hand, UK country manager at industrial automation parts supplier EU Automation, explains how technology can help industrial robots to feel.
More industries are beginning to invest in automation to improve productivity across the supply chain. Robots are often introduced to an assembly line to complete jobs that demand a level of accuracy, repeatability and speed that human workers cannot match. These large industrial robots can also take on the dull, dirty and dangerous jobs so that human workers can concentrate on other tasks.
Manufacturers can invest in robots to perform these more difficult tasks at a faster rate. However, there are still tasks on the assembly line that require the dexterity and touch of a human, particularly in the warehouse. Collaborative robots have become very popular in the last ten years because they can be placed anywhere in the facility to assist humans in their work without putting them in harm's way. While collaborative robots are more precise and gentler than traditional industrial robots, they are still not the best option for handling fragile goods.
Fragile items handled in e-commerce facilities, such as groceries or glass bottles, must be treated carefully to ensure the consumer receives the product in the condition they expect. Humans often complete these jobs because they can adapt their movements and judge how much force to apply to each product; for example, an egg must be handled differently to a packet of food.
Traditionally, robots used for delicate pick and place applications are fitted with silicone grippers that can grasp or pinch an object and handle it without causing damage. They often combine cameras and sensors, and are programmed to handle a specific product, requiring reprogramming if the product changes.
If businesses want to keep up with growing consumer demands, they should consider how advances in robotics can improve productivity in fragile pick and place applications.
Global development and technology consultancy Cambridge Consultants has developed a robot that can emulate human touch. Hank uses sensors and soft grippers controlled by airflows. Each finger is controlled individually and responds to touch sensors. The fingers will locate the object, adjust their position and close around an object until they “feel” the product and grasp it.
Hank's human-like senses allow it to pick small, irregular and delicate items without reprogramming. It can also apply increased force if it detects a slip, reducing the risk of breakages.
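The close-until-contact behaviour and slip compensation described above can be sketched in a few lines of code. This is purely illustrative: the `Finger` class, sensor model, thresholds and pressure values below are all invented for this example and are not Cambridge Consultants' actual control interface.

```python
"""Illustrative sketch of a touch-feedback grasp loop with slip
compensation, loosely modelled on the behaviour described for Hank.
All interfaces and numbers here are hypothetical."""

from dataclasses import dataclass


@dataclass
class Finger:
    """One pneumatic finger with a fingertip touch sensor."""
    pressure: float = 0.0        # airflow pressure curling the finger (kPa)
    contact_force: float = 0.0   # force reported by the touch sensor (N)

    def close_step(self, increment: float) -> None:
        # Increase airflow to curl the finger further around the object.
        self.pressure += increment


def read_sensors(fingers):
    # Stand-in for real hardware: pretend contact force builds once a
    # finger has curled far enough (pressure above 10 kPa in this model).
    for f in fingers:
        f.contact_force = max(0.0, f.pressure - 10.0) * 0.1


def grasp(fingers, contact_threshold=0.5, step=2.0, max_pressure=40.0):
    """Close each finger independently until its sensor 'feels' the object."""
    while any(f.contact_force < contact_threshold for f in fingers):
        for f in fingers:
            if f.contact_force < contact_threshold:
                if f.pressure >= max_pressure:
                    raise RuntimeError("no contact; object may be missing")
                f.close_step(step)
        read_sensors(fingers)


def compensate_slip(fingers, slip_detected, boost=5.0):
    """If the sensors register a slip, tighten the grip slightly."""
    if slip_detected:
        for f in fingers:
            f.close_step(boost)
```

Because each finger closes only until its own sensor registers contact, the same loop adapts to objects of different shapes and stiffness without per-product reprogramming, which is the key advantage the article attributes to this style of gripper.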
High tech start-up Wootzano has developed an electronic skin to give robots a sense of touch. Wootzkin has piezoelectric and piezoresistive sensing capabilities so that it can measure force and pressure and is also embedded with temperature sensors. This will give the robot feedback on force, temperature, pressure and humidity so that it can learn how to handle products as a human would.
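To make the idea of multi-sensor skin feedback concrete, here is a minimal sketch of how readings like those described for Wootzkin might be combined into a handling decision. Wootzano has not published this interface; every field name and threshold below is an invented assumption for illustration only.

```python
"""Hypothetical sketch of combining electronic-skin feedback (force,
pressure, temperature, humidity) into a simple handling decision.
Not Wootzano's actual API; all values are illustrative."""

from dataclasses import dataclass


@dataclass
class SkinReading:
    force: float        # N, from piezoelectric elements (dynamic events)
    pressure: float     # kPa, from piezoresistive elements (static load)
    temperature: float  # deg C, from embedded temperature sensors
    humidity: float     # % relative humidity at the contact surface


def handling_adjustment(reading: SkinReading,
                        max_pressure: float = 8.0,
                        wet_threshold: float = 80.0) -> str:
    """Translate one skin reading into a grip adjustment."""
    if reading.pressure > max_pressure:
        return "loosen"   # squeezing too hard for a fragile item
    if reading.humidity > wet_threshold and reading.force < 0.2:
        return "tighten"  # wet, low-friction surface is likely to slip
    return "hold"
```

In practice such rules would be learned from many handling examples rather than hand-written, which is what the article means by the robot learning to handle products as a human would.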
A robot might never be able to feel in the same way a human does, but that does not prevent robots from completing tasks that require a human touch. New developments in robotics, such as Hank and Wootzkin, allow manufacturers to improve dexterity on automated assembly lines and in warehouses without compromising on productivity.
The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow