
AI-powered Motion

Niko Eiden | AImotive

 

Who is AImotive?

AImotive, formerly AdasWorks, is the leader in AI-powered motion. AImotive delivers a full-stack technology solution and powerful artificial intelligence software for the automotive industry, designed to give self-driving vehicles better safety and increased productivity.

 

How does your self-driving technology work?

AImotive products deliver the robust technology required to operate self-driving vehicles in all conditions, and can be adapted to different driving styles and cultures. AImotive enables OEMs to move faster and more efficiently into fully autonomous car production. The AImotive suite of products includes:

  1. aiDrive: An in-car technology stack comprising a Recognition Engine, Location Engine, Motion Engine and Control Engine (a simplified sketch of how these four engines chain together appears after this list). The Recognition Engine combines and analyzes sensor data with AImotive’s pixel-precise segmentation tool, designed to recognize up to 100 different object classes such as pedestrians, bicycles, animals, buildings and obstacles. The Location Engine provides a globally scalable solution for precise self-localization of the vehicle, and it can also operate without HD maps by using landmark point data on top of conventional GPS positioning. The Motion Engine enables real-time tracking of moving objects by predicting their future location and behavior, allowing for optimal routing of the car even in emergency situations. The Control Engine is the execution component that manages primary functions such as acceleration, braking and steering, as well as auxiliary functions such as turn signals, headlights and the car horn.
  2. aiKit: Incorporates a complete ecosystem of tools that accelerates the development and operation of the aiDrive software. It consists of tools for training the AI that enable faster data collection and preparation, complete testing with real-time simulation of various driving conditions, and tested, over-the-air updates for maximum travel safety.
  3. aiWare: Performance-efficient hardware reference designs for automotive solutions, offering significantly reduced power consumption through a novel architecture that aggressively shifts toward high-bandwidth, low-latency Neural Network (NN) computation. While the goal of aiWare is to bring truly efficient AI hardware to market more quickly, aiDrive itself is designed to be processor-agnostic, allowing for seamless integration with GPU-, FPGA- or embedded-technology-based systems.
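To make the division of labor between the four aiDrive engines easier to picture, here is a minimal sketch of how such a perception-to-control pipeline could chain together. It is purely illustrative: the class names, method signatures and data structures are assumptions made for this article, not AImotive's actual API.

```python
# Illustrative sketch only: a minimal pipeline mirroring the four aiDrive
# engines described above. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class WorldModel:
    objects: list = field(default_factory=list)      # detected objects (class + position)
    pose: tuple = (0.0, 0.0, 0.0)                     # estimated vehicle position/heading
    predictions: list = field(default_factory=list)   # predicted object trajectories


class RecognitionEngine:
    def detect(self, camera_frames):
        # Segment each frame and return labeled objects (pedestrian, bicycle, ...)
        return [{"label": "pedestrian", "position": (12.0, 3.5)}]


class LocationEngine:
    def localize(self, gps_fix, landmarks):
        # Refine the GPS fix with landmark point data; HD maps are optional
        return (gps_fix[0], gps_fix[1], 0.0)


class MotionEngine:
    def predict(self, objects):
        # Track moving objects and extrapolate where they will be next
        return [{"label": o["label"], "future_position": o["position"]} for o in objects]


class ControlEngine:
    def actuate(self, world):
        # Translate the planned trajectory into primary control commands
        return {"throttle": 0.1, "brake": 0.0, "steering": -0.02}


def drive_step(frames, gps_fix, landmarks):
    """One recognition -> localization -> prediction -> control cycle."""
    world = WorldModel()
    world.objects = RecognitionEngine().detect(frames)
    world.pose = LocationEngine().localize(gps_fix, landmarks)
    world.predictions = MotionEngine().predict(world.objects)
    return ControlEngine().actuate(world)


# Example call with placeholder inputs
commands = drive_step(frames=[], gps_fix=(47.5, 19.0), landmarks=[])
```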

 

What makes AImotive's technology different from what else is on the market?

AImotive offers an architecture capable of robust scalability across the global market by using cameras as the primary sensors for greater affordability and accessibility. Unlike other solutions, AImotive's full-stack software uses the power of artificial intelligence to “see” fine detail and predict behavior, making it easier to manage common driving concerns such as poor visibility and adverse conditions. AImotive's training technique is also scalable, with a real-time simulator tool that trains the AI for a wide variety of traffic scenarios. AImotive's camera-first approach also allows straightforward integration with other sensors, such as LIDAR and radar, to further ensure the safety and reliability of its object detection.
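As one illustration of how a camera-first stack might fold in LIDAR or radar, the short sketch below cross-checks camera detections against nearby radar returns. The function name, data layout and distance threshold are assumptions made for this example, not a description of AImotive's implementation.

```python
# Illustrative only: confirm camera detections with nearby radar returns.
def confirm_with_radar(camera_objects, radar_points, max_distance=1.5):
    """Mark a camera detection as radar-confirmed if a return lies within max_distance meters."""
    confirmed = []
    for obj in camera_objects:
        ox, oy = obj["position"]
        near = any(
            ((rx - ox) ** 2 + (ry - oy) ** 2) ** 0.5 <= max_distance
            for rx, ry in radar_points
        )
        confirmed.append({**obj, "radar_confirmed": near})
    return confirmed


# Example: one pedestrian detection, one radar return close to it
result = confirm_with_radar(
    [{"label": "pedestrian", "position": (12.0, 3.5)}],
    [(12.4, 3.2)],
)
```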

 

Can the software be used on a variety of vehicles or is it designed specifically for automobiles?

The software is vendor- and hardware-agnostic, created purposefully to be scalable and accessible for a wide variety of vehicles, climates and cultures around the world. It can be used for various levels of autonomous operation in cars, trucks, buses, rail and agriculture/construction vehicles.

 

What hardware is required for the system to function?

Assuming a vehicle is already equipped with drive-by-wire controls (i.e., primary functions like steering and braking are controlled electronically), the vehicle will need detection hardware in the form of 6-12 cameras offering a detailed surround view, plus an object-detection backup system consisting of LIDAR or radars. The sensors need to be connected in a centralized architecture to a computing platform capable of feeding the sensor inputs to the aiDrive software in real time.
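For illustration only, the small configuration sketch below captures the sensor set described above: 6-12 surround-view cameras, a LIDAR/radar backup, and a single centralized compute platform. The field names and defaults are hypothetical, not a vendor specification.

```python
# Hypothetical sensor configuration matching the hardware described above.
from dataclasses import dataclass


@dataclass
class SensorConfig:
    cameras: int = 8                   # surround-view cameras (6-12 typical)
    radars: int = 2                    # backup object-detection sensors
    lidars: int = 1
    centralized_compute: bool = True   # one platform feeds aiDrive in real time


config = SensorConfig()
assert 6 <= config.cameras <= 12, "surround coverage calls for 6-12 cameras"
assert config.radars + config.lidars >= 1, "a LIDAR or radar backup is required"
```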

 

How soon do you think you will see the system available for public use and what needs to happen before that becomes a reality?

AImotive is taking a step-by-step approach to releasing features for public use to ensure safety. We expect the first aiDrive components to focus on object detection for rear, front and surround use cases and to be available to the public within a few years, followed by the first autonomous driving features such as highway autopilot. Fully autonomous driving will still require a significant amount of testing and associated legislation updates, so the timing is impossible to predict.

 

What are your plans following the recent U.S. expansion?

Our vision is to bring global accessibility to self-driving vehicles, faster and more safely than any other company in the world. Beyond our U.S. offices, we are also looking to expand to additional locations, most likely starting with Scandinavia and East Asia.

 

What do you see for the future of autonomous driving?

Autonomous driving is where the automobile market is heading, and the companies that stand out as leaders in the space will be those that find a way to make the technology affordable and accessible across the varied markets and regions around the world. At AImotive, we firmly believe that an industry standard that works across multiple platforms will benefit the whole market. We value openness over secrecy, sharing our vision of AI-powered vehicles with the world. If the industry can work together, the world will be able to move more efficiently toward the future of everyday mobility.

 

About Niko Eiden, Global COO, AImotive
Niko currently serves as Global COO of AImotive, the leader in AI-powered motion, where he oversees the company's global expansion and sales activities. Niko brings particular strength to this role, seamlessly combining business and technology to build collaborative relationships that play to the strengths of AImotive's fast-moving startup mode as well as the resources and industry expertise of its large, multinational customers. In addition to his role at AImotive, this summer he co-founded a new startup in the AR/VR space. Prior to joining AImotive, Niko was responsible for all future mobile technologies at Microsoft, and has also held various non-technical roles in marketing, business development and strategy. In his free time he enjoys building and operating his own aircraft and human-powered vehicles. He is a true world citizen who has lived, studied and worked around the world, from Asia to America.

