
The Future of “Cobots” is AI Voice Interface at the Edge

Nicolas Baron | Snips

The market for cobots (collaborative robots) is expected to grow to more than $10 billion and account for 34% of all manufacturing robot sales by 2025, according to the Robotics Industries Association. This is a significant shift: cobots currently make up only a small percentage of a market dominated by robots designed to perform a specific task in isolation from humans. Although this move towards robots and humans working side by side seems to further solidify the need for manufacturing jobs, it puts pressure on organizations to find or train employees who can operate increasingly complex equipment. To address this challenge, the industry is likely to rely more heavily on AI voice interfaces at the edge, for two key reasons.


The convenience of voice interface

Over 8 billion digital voice assistants are expected to be in use by 2023, according to Juniper Research, up from roughly 2.5 billion at the end of 2018. The growing presence of AI-powered technology in our personal lives is quickly creating an enormous population of people who are not only comfortable using voice to operate technology but prefer it. According to the Institution of Mechanical Engineers, a skills shortage – specifically around technology – is seen as one of the largest hurdles in the manufacturing space. When done well, an AI voice interface simplifies interaction and lowers the barrier to entry for otherwise complex technology.

In addition to providing a simpler interface, voice can improve worker safety. Some factories already use voice interfaces during equipment repairs, letting employees keep their safety gloves on and dictate repair notes while they fix a damaged piece of equipment. The value is only enhanced when applied to cobots: factory workers will be able to keep both hands on their task as they direct their robot helpers by voice or receive live feedback on what their robots are seeing or doing.


Edge capabilities have improved

Although AI voice interfaces have become a regular part of consumer technology, the factory floor is a very different environment, and there are good reasons voice has not yet become commonplace there. Amazon and Google currently dominate the consumer voice assistant market, but the way their technology operates is incompatible with the needs of a modern factory. Amazon Echo and Google Home devices both require an internet connection so that their voice assistants can reach the cloud. User interactions are stored in the cloud, allowing the assistants to learn from them and progressively improve. However, this creates two problems: the much-discussed privacy concerns, and latency, the delay between telling your voice assistant to do something and that action taking place.

Both of these issues are major concerns for factories, which want to ensure that proprietary information is not leaked and cannot afford recurring latency problems. However, both concerns can be addressed by bringing the AI voice assistant to the edge. Edge computing refers to applications, services, and processes performed outside of a central data center and closer to the end user. In simple terms, edge devices do not require an internet connection to function; they do the data processing on the device itself. Because data is processed and stored on-device, latency is reduced and trade secrets stay secure, as data never leaves the manufacturer's network. Additionally, capabilities do not have to be sacrificed when using a voice interface at the edge. Recent advances such as neuromorphic chips, which were inspired by the human brain, have made it possible to embed sophisticated AI without relying on the cloud's compute power.
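To make the on-device idea concrete, here is a minimal sketch of how an edge voice pipeline might map a recognized utterance to a cobot command without any network call. This is an illustration only, not Snips' actual API: the transcribe_on_device stub stands in for whatever embedded speech-to-text engine the device runs, and the Command structure and intent patterns are hypothetical.

```python
# Minimal sketch of an edge voice-to-cobot-command pipeline (illustrative only).
# transcribe_on_device, Command, and the intent patterns are hypothetical
# placeholders, not Snips' or any vendor's actual API.

import re
from dataclasses import dataclass
from typing import Optional


@dataclass
class Command:
    action: str            # e.g. "pick", "stop", "report_status"
    target: Optional[str]  # optional object name extracted from the utterance


# Intent patterns are matched entirely on-device; neither audio nor text
# ever leaves the local network.
INTENT_PATTERNS = [
    (re.compile(r"\bstop\b"), "stop"),
    (re.compile(r"\bpick up (?:the )?(?P<target>\w+)"), "pick"),
    (re.compile(r"\bstatus\b|\bwhat are you doing\b"), "report_status"),
]


def parse_command(transcript: str) -> Optional[Command]:
    """Map a transcribed utterance to a cobot command, or None if unrecognized."""
    text = transcript.lower()
    for pattern, action in INTENT_PATTERNS:
        match = pattern.search(text)
        if match:
            return Command(action=action, target=match.groupdict().get("target"))
    return None


def transcribe_on_device(audio_frame: bytes) -> str:
    """Placeholder for an embedded speech-to-text engine running locally."""
    raise NotImplementedError("wire this to the device's on-board ASR engine")


if __name__ == "__main__":
    # Simulated transcripts, as an on-device ASR engine might produce them.
    for utterance in ["Pick up the bracket", "What are you doing?", "Stop"]:
        print(utterance, "->", parse_command(utterance))
```

Because both recognition and intent matching run on the device in a setup like this, response time is bounded by local compute rather than a round trip to a data center, and nothing in the utterance has to cross the factory's network boundary.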

AI voice assistance is already beginning to appear on the factory floor, and its importance will only grow as factories become more complex and greater collaboration is needed between robots and humans. Voice AI at the edge can simplify technology, making it accessible to workers who may not be “tech fluent”; improve worker safety; and function even in environments where network connectivity is not reliable. With the recent advances in edge capabilities, it is only a matter of time before talking to robot helpers on the factory floor is commonplace.

About Nicolas Baron
Nicolas Baron is a Business Development Director at Snips. He joined Snips from IBM, where he worked in delivery business units, then as a product manager, and most recently in the sales organisation, where he was responsible for generating new business opportunities in IT infrastructure. Nicolas holds an MBA from the University of Warwick.
