CEO Jeff Linnell brings an unorthodox approach to robotics. Part cinematographer, part creative director, and part self-taught engineer, Jeff is a serial robotics entrepreneur and thought leader.

Advancing AI to Get Robots to Work With Humans

Jeff Linnell, CEO | Formant

Tell us about Formant and your role with the company.

I am the CEO and Founder of Formant. We enable businesses to automatically collect, analyze, and act on robotics data. Specifically, we help operators keep robots active in the world.

We believe that the future of robotics is one where you have one operator managing hundreds or thousands of robots. With our platform, operators at robotics companies become superhumans. We’ve reduced the cognitive load to a point where they can react to issues as they come up, all while performing analysis to reduce the overall number of issues. 

 

You recently spoke at RoboBusiness about how you feel the future is in robot/human collaboration, not full autonomy.  What is the reasoning behind your opinion?

Only recently has the robotics industry reached a point where vision, machine intelligence, and hardware are good enough to get a robot to 90% reliability in a semi-structured environment. 

I speak to a lot of companies in our field, and they all tell me the same thing: a 90% reliable robot in a fleet of thousands is 0% useful. To reach very high reliability in short order, we believe humans have to be part of the equation.
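
To put rough numbers on that point, here is a back-of-the-envelope sketch; the fleet size and task rate are illustrative assumptions, not figures from the interview:

```python
# Back-of-the-envelope arithmetic behind "90% reliable is 0% useful" at scale.
# All numbers below are illustrative assumptions.
fleet_size = 1_000
reliability = 0.90            # chance a robot completes a task unaided
tasks_per_robot_per_day = 50

failures_per_day = fleet_size * tasks_per_robot_per_day * (1 - reliability)
print(f"{failures_per_day:,.0f} interventions needed per day")  # -> 5,000
```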

A robot might, for example, be unsure about the objects in its vision, so it needs instruction to go left or right. The human in the loop deals with these edge cases and does the things that robots aren’t good at: solving problems that require creativity and intuition.
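
As a concrete illustration of that handoff, here is a minimal sketch of a robot deferring a low-confidence perception call to a remote operator; the function names and threshold are hypothetical, not Formant’s API:

```python
# Hypothetical human-in-the-loop fallback: act autonomously when perception is
# confident, otherwise escalate the edge case to a remote operator.
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.9  # below this, defer to a human (illustrative value)

@dataclass
class Detection:
    label: str         # e.g. "pallet" or "person"
    confidence: float  # 0.0 - 1.0

def choose_direction(detection: Detection, ask_operator) -> str:
    """Return 'left' or 'right', escalating ambiguous cases to an operator."""
    if detection.confidence >= CONFIDENCE_THRESHOLD:
        # Confident enough to act autonomously.
        return "left" if detection.label == "pallet" else "right"
    # Edge case: low confidence, so a human makes the call.
    return ask_operator(f"Unsure about '{detection.label}' "
                        f"({detection.confidence:.0%}): go left or right?")

# Example operator callback; in practice this would be a teleop console prompt.
print(choose_direction(Detection("pallet", 0.62), ask_operator=lambda q: "right"))
```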

Autonomy will improve, but I don’t foresee us getting to a point where robots in unpredictable environments can operate without humans in the next 10 years. 

 

What types of applications and environments do you see robots operating in the future?

The big shiny thing in robotics right now is the self-driving car. It is going to be a very long time before we take a ride in a self-driving car every day. Instead what I’m seeing is robots operating in semi-structured environments doing boring tasks, like cleaning floors, delivering medicines between hospital rooms, or assisting trades in construction. 

Companies are taking rifle shots at these very specific applications. They aren’t even thinking of themselves as “robotics companies”, but as companies automating a specific task. The goal here is to find applications where a robot can be more efficient or effective than manual labor. As a result, there’s an explosion of companies doing these very simple things that are very difficult for machines but very easy for humans. 

 

How do we advance from cobots operating at a 1 human to 1 robot ratio to 1 human operating 10 robots?

I don’t think this is a future advancement; it’s already here. We’re already seeing companies deploy large numbers of robots managed by a centralized operations team. These companies, and there aren’t that many of them, overcame the problem by building software stacks from scratch.

The deficiency right now is that software is lagging behind the robotics hardware. It’s fairly easy to buy hardware components off the shelf that can see and manipulate objects or move through the world. There isn’t yet a robust software ecosystem that supports managing large fleets. I believe the creation of this ecosystem will fuel the next generation of robotics, dramatically reducing the cost and time to deploy a robot fleet.

 

Companies are increasingly going to be managing larger fleets. What obstacles and roadblocks do you expect them to hit as they scale?

We talk a lot at Formant about what it takes to scale robotics companies. Early on, companies generally decide to bootstrap much of their technical infrastructure in-house. Then they get their first big order, their first customer.

Suddenly they have to “flip the ratio” from 10 engineers building 1 robot to 1 engineer managing 10s or 100s of robots. This requires the company to ingest data from the entire fleet over uncertain networks, move it to a place where it can be analyzed, and visualize it so it’s easy to make sense of. Doing this in an efficient and scalable way is a technical challenge, and it’s even harder when your team isn’t used to building cloud-first software.
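
One slice of that pipeline, store-and-forward telemetry that tolerates a flaky network, might look roughly like the sketch below; the payload shape and upload hook are assumptions for illustration, not a real API:

```python
# Minimal store-and-forward sketch: telemetry is buffered on the robot and
# flushed to the cloud whenever the network cooperates. Illustrative only.
import json
import time
from collections import deque

class TelemetryBuffer:
    def __init__(self, upload, max_items=10_000):
        self.queue = deque(maxlen=max_items)  # oldest samples drop if storage fills
        self.upload = upload                  # callable that sends a batch; may raise

    def record(self, stream, value, ts=None):
        self.queue.append({"stream": stream, "value": value, "ts": ts or time.time()})

    def flush(self, batch_size=100):
        """Push buffered data in batches; keep it locally if the link is down."""
        while self.queue:
            batch = [self.queue.popleft()
                     for _ in range(min(batch_size, len(self.queue)))]
            try:
                self.upload(json.dumps(batch))
            except ConnectionError:
                # Network dropped: put the batch back and retry on the next flush.
                self.queue.extendleft(reversed(batch))
                break

buf = TelemetryBuffer(upload=lambda payload: None)  # stand-in uploader
buf.record("battery_pct", 87)
buf.flush()
```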

These companies are, for the first time, creating new, highly advanced technology and new processes to manage and make sense of that technology. In a sense they are creating entirely new disciplines - robotics operators, cloud roboticists, devops for robots - all of these are roles that didn’t exist 5 years ago. The talent pool for these disciplines is still small, which means for the foreseeable future it won’t be easy to recruit and train employees.

 

What trends are exciting in the industry? What do you think has been oversold?

I’m loving that you are starting to see B2B companies take off that are solving very specific problems. Companies are showing that a robot plus a human can be 20x as efficient as either of them individually. I can point to seven or eight companies in our direct San Francisco neighborhood that fit this mold. To me, this is a giant signal that B2B robotics is finally starting to get some traction.

The promise and utility of personal robotics has been oversold. Beyond their novelty, the usefulness of an assistive robot is questionable. For example, I couldn’t clearly articulate the benefit of a mobile Amazon Echo, or a robot that could follow me around and present information by projecting video on a wall. We have supercomputers in our pockets that are always with us. What’s the benefit of actuation? This is why I’m excited by the pragmatism of B2B applications. Real work is being done by robotic “tools” right now, largely in coordination with human operators and co-workers. This is a future I will bet on.

 

What innovations are allowing AI to work better independently and with humans? 

AI and machine learning have advanced incredibly in the last few years. The community has open access to new technologies and infrastructure like TensorFlow, a machine learning library that makes it really easy to build and iterate on ML models. It’s poised to unlock incredible value in the ML community.
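
As a rough illustration of how little code it takes to define and iterate on a model with TensorFlow’s Keras API (the toy data and architecture below are made up for this example):

```python
# Toy example only: a tiny classifier trained on synthetic "sensor" data to show
# how compact model iteration is with TensorFlow's Keras API.
import numpy as np
import tensorflow as tf

x = np.random.rand(256, 4).astype("float32")   # fake sensor features
y = (x.sum(axis=1) > 2.0).astype("int32")      # fake binary label

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=3, verbose=0)           # a few seconds per iteration
```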

However, any machine learning expert will tell you - a model is only as good as the data underlying it. You are just starting to see systems, like Formant’s, that automatically collect, ingest, store, and display robotics data in human-readable ways. I believe this is absolutely necessary to both improve and tune the ML algorithms and allow humans to observe decisions made by AI. To date a lot of ML algorithms and models have been largely opaque. Observability into the data is paramount to understanding it. 
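
One simple way to get that observability, sketched here with hypothetical field names rather than Formant’s actual schema, is to log every model decision next to the inputs it saw:

```python
# Illustrative decision log: record what the model saw and what it decided so a
# human can audit the behavior later. Field names are assumptions, not a real schema.
import json
import time

def log_decision(path, inputs, prediction, confidence):
    record = {
        "ts": time.time(),
        "inputs": inputs,          # features the model saw
        "prediction": prediction,  # what it decided
        "confidence": confidence,  # how sure it was
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")  # one JSON line per decision

log_decision("decisions.jsonl",
             {"lidar_min_range_m": 0.8, "camera_objects": ["pallet"]},
             prediction="slow_down",
             confidence=0.74)
```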

 

Why is it important for robotics companies to be thinking about analytics and business intelligence?

Analytics is the lifeblood of a robotics company. Whether they want to diagnose a problem, prove value to investors, or sell their product through performance data, they need analytics. 

What we’re seeing now is that a good analytics stack that automates the creation of insights is table stakes to build a scaled robotics company. Every employee needs access to insights - from engineers debugging product to operators improving the uptime of the fleet. I’m noticing a strong trend - companies that invest and build an analytics stack early in their lifecycle have an easier time scaling than those who do not. 
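
As a toy example of the kind of insight such a stack automates, here is a per-robot uptime calculation; the robot IDs and hours below are made up:

```python
# Toy fleet-uptime report from made-up status data: which robots need attention?
from collections import defaultdict

events = [  # (robot_id, hours_observed, hours_operational), illustrative values
    ("amr-001", 24.0, 22.5),
    ("amr-002", 24.0, 18.0),
    ("amr-003", 24.0, 23.9),
]

totals = defaultdict(lambda: [0.0, 0.0])
for robot, observed, operational in events:
    totals[robot][0] += observed
    totals[robot][1] += operational

for robot, (observed, operational) in sorted(totals.items()):
    print(f"{robot}: {operational / observed:.1%} uptime")
# amr-002 stands out at 75%, so an operator knows where to look first.
```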

 

Can you give us an idea on what we might see from Formant over the next 2-3 years?

Formant has built a full stack to enable humans to collect, analyze and act on robotics data. 

This is just the beginning. We are continuing to add value via ML services on top of this data to provide insights, illuminate anomalies, reduce human cognitive load heuristically, and ensure compliance with regulatory environments (GDPR, for example). Step 1) Make it easy to acquire data. Step 2) Make the data smarter.
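
To make “illuminate anomalies” concrete in the simplest possible terms, here is a hedged sketch that flags telemetry samples far from the stream’s mean; the threshold and the motor-current numbers are illustrative assumptions:

```python
# Simplest-possible anomaly flagging: mark samples whose z-score exceeds a threshold.
# Real systems would use far richer models; this is only an illustration.
from statistics import mean, stdev

def find_anomalies(samples, threshold=2.0):
    """Return indices of samples more than `threshold` std devs from the mean."""
    if len(samples) < 2:
        return []
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []
    return [i for i, x in enumerate(samples) if abs(x - mu) / sigma > threshold]

motor_current_amps = [1.1, 1.0, 1.2, 1.1, 1.0, 6.4, 1.1]  # spike at index 5
print(find_anomalies(motor_current_amps))  # -> [5]
```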

There is an incredible amount of under-leveraged data in the world today, accelerated by the ubiquity of data networks and mobile devices spewing out huge amounts of digital exhaust. Robots (rolling sensor packages with ample compute) are about to geometrically expand this data set. The true value and opportunity lie in making this data digestible, understandable, and human-readable. Everything we are working on is in service of this goal.

 


The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow
