The Future of Shopping is Here Today!
Steve Arar for All About Circuits: Recently, Vijay Kumar's lab at the University of Pennsylvania, in cooperation with researchers from Qualcomm, unveiled a quadrotor that can fly aggressively through a window. You may think you have seen similar robots before; however, there is a big difference between previously designed robots and this new technology. Generally, to perform challenging maneuvers, a quadrotor depends on an array of cameras mounted on the walls and on external processors. The images captured by the cameras are processed off-board and the results are delivered to the robot; the computer issues precise commands, and the only thing the robot needs to do is follow them. The new robot, however, performs both image capture and processing onboard. The quadrotor carries an IMU, a Qualcomm Snapdragon processor, and a Hexagon DSP. With these onboard sensors and processors, the robot can perform localization, state estimation, and path planning autonomously. Cont'd...
Alistair Blair for Bloomberg Technology: The word "robot" conjures images of bulky, metal humanoid objects moving awkwardly. Robotics veteran Rich Mahoney is trying to change that perception by creating a robotic exoskeleton people can wear. After more than seven years running a robotics group at Silicon Valley research institution SRI International, Mahoney left about a year ago to form a startup called Superflex. On Tuesday, the company said it raised $9.6 million from investors including Japanese venture capital group Global Brain and Horizons Ventures, the VC fund of Asian billionaire Li Ka-shing. Superflex is developing a lightweight suit with electric "muscles" that help the elderly and other less-mobile people move around. The system, which will look a bit like a unitard, is designed to provide the wearer with extra strength to get up from a chair or stand for longer. The device has thin actuators built in that use battery power to contract at the same time as people's real muscles. Cont'd...
Endeavor Robotics Submits Proposal Response as Prime System Integrator (PSI) for the Advanced Explosive Ordnance Disposal Robotic System (AEODRS) Increments 2 & 3
Endeavor Robotics (formerly iRobot Defense & Security), the United States-based leader in ground robotics for defense, first responders, and the nuclear industry, has submitted a formal proposal response for the AEODRS Increment 2 Tactical Operations System (TOS) and Increment 3 Base/Infrastructure Operations (BIOS) variants.
Bodkin Design Unveils Unique Snapshot Hyperspectral Products Through Exclusive Partnership With Cubert
Bodkin Design and Engineering, LLC will team with Cubert GmbH at Photonics West 2017 for a presentation and product demo showcasing their patented snapshot spectral imaging for precision agriculture, medical and biotech applications, machine vision, and more.
Open Source Robotics and IoT Framework Releases Version 1.0
TraceParts challenged its engineering community to take part in a mystery word competition
Because LiDAR relies on light, the target must be visible, so it is not an all-weather solution. It will not work well in fog or other conditions that reduce visibility, but when conditions are clear it can operate during both day and night.
From comma.ai: Last week, we open sourced an advanced driver assistance system to help accelerate the future of self-driving cars and provide a platform anyone can build on top of. We released both openpilot, our driving-agent research software, and NEO, a robotics platform capable of running openpilot, under the MIT license. openpilot is an open source adaptive cruise control and lane keeping assist system, both safety features available on modern cars. We would like to build the best ones on the market and help you retrofit them to existing cars. NEO is an open source robotics research platform centered around an Android phone, similar to Android Based Robots. The modern smartphone is an incredible platform packed with sensors and processing power. NEO also includes a cooling solution and a CAN interface board. CAN is a networking protocol used in cars, trucks, power wheelchairs, golf carts, and many other robotics applications. With a forthcoming openpilot release, it will become easier for researchers to add support for their own vehicles. On older cars, some actuators may be harder to control than others, but it should be very possible to control the gas electronically for a gas-only adaptive cruise control. Researchers can also add mechanical actuators for controls that cannot be electronically actuated. Have fun, be safe, and let's usher in the future of self-driving cars together... (Github repo) (Interview)
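For readers unfamiliar with CAN, the protocol moves small fixed-size messages on a shared bus. As a rough illustration only (this is not openpilot code, and the 0x7DF identifier and payload bytes are hypothetical), a classic CAN frame can be serialized in the Linux SocketCAN wire layout like this:

```python
import struct

# Linux SocketCAN wire format: 32-bit CAN ID, 8-bit data length code (DLC),
# 3 padding bytes, then up to 8 data bytes -> 16 bytes total.
CAN_FRAME_FMT = "<IB3x8s"
CAN_FRAME_SIZE = struct.calcsize(CAN_FRAME_FMT)

def pack_can_frame(can_id: int, data: bytes) -> bytes:
    """Serialize a classic CAN frame (at most 8 data bytes) into SocketCAN layout."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    return struct.pack(CAN_FRAME_FMT, can_id, len(data), data.ljust(8, b"\x00"))

def unpack_can_frame(frame: bytes):
    """Deserialize a SocketCAN frame back into (can_id, data)."""
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, frame)
    return can_id, data[:dlc]

# Hypothetical broadcast request on ID 0x7DF with a 3-byte payload.
frame = pack_can_frame(0x7DF, bytes([0x02, 0x01, 0x0D]))
```

A CAN interface board like NEO's sits between frames of this kind and the vehicle's physical bus wiring.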
The contest will take up a large area of the circuit, since it will consist of three challenges and a triathlon-style Grand Challenge. Challenge 1 will require teams to use an Unmanned Aerial Vehicle (UAV) to locate, track, and land on a moving vehicle. This means the UAV must be capable of locating the moving vehicle, matching its path and speed, and landing on the target location while the vehicle is in motion.
Optelos Launches Turnkey Solution for Drone Workflow, Solving Key Challenges Holding Back Drone Programs
Optelos' turnkey approach is setting the standard for securely managing drone work activities and data content.
CES 2017 in Las Vegas Brings AutonomouStuff and NEXCOM Together to Innovate Automated Driving Solutions
Glenn McDonald for Seeker: Want to know what drones of the future will look like? So does David Lentink, editor of Interface Focus, a journal that, as its title suggests, looks at the interface of different scientific disciplines. Each issue zeroes in on a particular intersection of physical sciences and life sciences and invites the world's top scholars to publish their latest work. The latest issue of Interface Focus brings together biologists and engineers to discuss a topic that's relatively straightforward and, well, pretty empirically cool: "It's completely focused on how animals fly and how that can help us build flying robots," said Lentink, assistant professor of mechanical engineering at Stanford. Can't argue with that. The new issue features 18 newly published papers on various ways that engineers are borrowing ideas from nature to make the next generation of drones and aerial robots. Several of the papers detail prototype drones that have already been built and tested. Cont'd...
From Alexander Shtuchkin: Code & schematics for a position tracking sensor using HTC Vive's Lighthouse system and a Teensy board. A general-purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; less than 2 mm possible with additional work
- Update frequency: 30 Hz
- Output formats: text; Mavlink ATT_POS_MOCAP via serial; Ublox GPS emulation (in the works)
- HTC Vive station visibility requirements: full top hemisphere from the sensor; both stations need to be visible
- Positioning volume: same as HTC Vive, up to approx. 4x4x3 meters
- Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations, 2x $135)
- Skills to build: low-complexity soldering; embedded C++ recommended for integration into your project
(Github page)
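On the consuming side, the sensor's text output could be parsed on a host computer. The exact serial text format is not specified in the excerpt above, so the whitespace-separated "x y z" layout assumed here is purely hypothetical:

```python
def parse_position_line(line: str):
    """Parse one hypothetical 'x y z' text line (meters) into a tuple of floats.

    The sensor's actual serial text format is not documented in the excerpt;
    whitespace-separated coordinates are assumed for illustration only.
    """
    x, y, z = (float(tok) for tok in line.split())
    return (x, y, z)

# At the stated 30 Hz update rate, a new sample arrives roughly every 33 ms.
SAMPLE_PERIOD_S = 1.0 / 30.0
```

Swapping this parser for a Mavlink ATT_POS_MOCAP decoder would be the route for feeding the data to a flight controller.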
The insect drone takes on the functions of larger UAVs but shrinks them into a miniature, virtually undetectable device.
A brushless DC motor solution for use in hip and knee exoskeletons. This complete joint actuation unit consists of a motor, gearhead, encoder, and position controller. Fitting an absolute encoder directly at the joint rotation axis gives designers increased positioning accuracy. The unit delivers 54 Nm of continuous torque and 120 Nm at a 20% duty cycle. The system can be operated on supplies between 10 and 50 V DC, and the actuation speed is up to 22 rpm.
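As a quick sanity check on the figures above, mechanical output power follows from P = τ·ω, with ω converted from rpm to rad/s. A minimal sketch of that arithmetic (the formula applied to the quoted ratings, not a vendor specification):

```python
import math

def mechanical_power_w(torque_nm: float, speed_rpm: float) -> float:
    """Mechanical output power P = torque * angular velocity (rad/s)."""
    omega = speed_rpm * 2.0 * math.pi / 60.0  # rpm -> rad/s
    return torque_nm * omega

# At the rated 54 Nm continuous torque and the 22 rpm maximum speed:
continuous_power = mechanical_power_w(54.0, 22.0)   # ~124 W
# At the 120 Nm peak torque (20% duty cycle), still at 22 rpm:
peak_power = mechanical_power_w(120.0, 22.0)        # ~276 W
```

Both figures land comfortably within what a 10-50 V DC supply can deliver at modest currents, which is consistent with a wearable exoskeleton power budget.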