Embedded Vision Alliance Announces 2018 Vision Tank Finalists
Finalists Represent Excellence in Computer Vision Innovation
WALNUT CREEK, Calif., April 23, 2018 /PRNewswire/ -- The Embedded Vision Alliance® today announced the finalists for the 2018 Vision Tank, the Embedded Vision Summit's annual start-up competition that showcases the best new ventures using computer vision in their products. Each finalist will pitch their company and product to judges in front of the Summit audience on May 23, 2018, with winners announced at the end of the session.
This year's Vision Tank finalists are:
AiFi is building a scalable version of "Amazon Go" to empower stores of the future to be check-out free. AiFi's innovative AI-powered sensor networks also provide retailers with valuable insights about shopping behavior and product preference, as well as improved inventory management.
Aquifi provides visual inspection services for logistics and manufacturing, based on the combination of 3D reconstruction and deep learning. The company's solution, a trainable virtual inspection system, increases the throughput of human workers and reduces errors due to fatigue and repetition.
Boulder AI has created an intelligent camera that is waterproof, dust-proof and runs AI at the image source. This edge-processing camera runs AI/machine learning and computer vision algorithms without the cloud, distilling visual information into actionable data.
Sturfee's technology is built on deep learning, computer vision and satellite imaging principles, enabling devices and machines to use visual data as input to precisely locate themselves in the real world, identify where they are looking and recognize what is around them.
VirtuSense Technologies' product identifies people who are at risk of falls and injuries. The core technology is based on machine vision, using a 3D time-of-flight sensor to track a person's static and dynamic balance, identify sensory and muscular deficits and provide objective data to assess and treat issues.
"This year we received a record number of Vision Tank entrants. Our panel judged these start-ups on technical innovation, business plan, team and business opportunity," said Jeff Bier, founder of the Embedded Vision Alliance. "These five finalists have distinguished themselves by tackling meaningful, real-world problems with fresh, promising approaches and talented teams."
About the 2018 Embedded Vision Summit
The Embedded Vision Summit is being held May 21-24, 2018, at the Santa Clara Convention Center. The Summit is the only event focused exclusively on deployable computer vision, attracting a global audience of companies developing vision-enabled products, both at the edge and in the cloud. The 2018 Embedded Vision Summit will feature more than 90 presentations and showcase more than 100 technology demos, as well as host a variety of technical workshops and training classes. The sponsors announced to date for the 2018 Summit are Aimotive, Allied Vision Technologies, ARM, BDTI, Cadence Design Systems, Horizon Robotics, Intel, Lattice Semiconductor, Luum, Nextchip, Novumind, NXP Semiconductors, Synopsys and Xilinx. For the latest updates on the Embedded Vision Summit, follow @EmbVisionSummit on Twitter.
About The Embedded Vision Alliance
The Embedded Vision Alliance is a worldwide industry partnership bringing together technology providers and end-product companies who are enabling innovative and practical applications for computer vision for a range of market segments and applications, including automotive, consumer electronics, gaming, imaging, and more. Membership is open to any company that supplies or uses technology for computer vision systems and applications. For more information on the Alliance, visit https://www.embedded-vision.com.