From Seeker: The Pentagon may soon be unleashing a 21st-century version of locusts on its adversaries after officials on Monday said it had successfully tested a swarm of 103 micro-drones.
This important step toward new autonomous weapon systems was made possible by improvements in artificial intelligence, opening the possibility that groups of small robots could act together under human direction.
Military strategists have high hopes for such drone swarms, which would be cheap to produce and able to overwhelm opponents' defenses through sheer numbers.
The test of the world's largest micro-drone swarm in California in October included 103 Perdix micro-drones measuring around six inches (15 centimeters), launched from three F/A-18 Super Hornet fighter jets, the Pentagon said in a statement.
"The micro-drones demonstrated advanced swarm behaviors such as collective decision-making, adaptive formation flying and self-healing," it said. Cont'd...
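The Pentagon statement does not describe the control algorithms, but decentralized flocking of the kind pioneered by Reynolds' "boids" gives a feel for how swarm behavior can emerge from purely local rules. A minimal sketch, with 2D positions encoded as complex numbers; every gain, radius, and starting state below is invented for illustration and is not the Perdix system:

```python
# Boids-style flocking step: each agent steers using only local rules
# (cohesion, alignment, separation) and sees nothing beyond its sensing
# radius. Purely illustrative; the Perdix control laws are not public.

def flock_step(positions, velocities, radius=5.0, sep_dist=1.0, dt=0.1):
    """Advance every agent one time step using neighbor-only rules."""
    new_pos, new_vel = [], []
    for i, (p, v) in enumerate(zip(positions, velocities)):
        # Neighbors within sensing radius (decentralized: no global state).
        nbrs = [j for j, q in enumerate(positions)
                if j != i and abs(q - p) < radius]
        steer = 0j
        if nbrs:
            centroid = sum(positions[j] for j in nbrs) / len(nbrs)
            steer += 0.05 * (centroid - p)                  # cohesion
            mean_v = sum(velocities[j] for j in nbrs) / len(nbrs)
            steer += 0.1 * (mean_v - v)                     # alignment
            for j in nbrs:                                  # separation
                d = p - positions[j]
                if 0 < abs(d) < sep_dist:
                    steer += 0.2 * d / abs(d) ** 2
        v = v + steer
        new_vel.append(v)
        new_pos.append(p + v * dt)
    return new_pos, new_vel

# Three agents, all initially flying in the +x direction, drift into a
# loose formation over repeated steps with no central coordinator.
pos = [0 + 0j, 4 + 0j, 0 + 4j]
vel = [1 + 0j, 1 + 0j, 1 + 0j]
for _ in range(100):
    pos, vel = flock_step(pos, vel)
```

Losing one agent simply removes it from its neighbors' lists, which is one way the "self-healing" property of such swarms can fall out of the rules rather than being a separate mechanism.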
Rob Trice & Seana Day via Forbes: Last month, as our Mixing Bowl colleagues Michael Rose and An Wang were interviewing Sonny Ramaswamy of the USDA’s NIFA to better understand current US food and agriculture labor issues, we were representing The Mixing Bowl in discussions on potential solutions to food-production labor issues through automation and robotics.
At this year’s RoboUniverse event in San Diego, a full-day track on December 14th was dedicated to the application of robotics to agriculture. The industry track, pulled together in great part by Nathan Dorn, CEO of Food Origins and an Advisor to The Mixing Bowl, featured a knowledgeable group of automation/robotics experts and food producers who drew on their experience to define the opportunities and sharpen focus on the challenges. Nathan authored a detailed summary of the day in a post on AgFunder.
Our conclusion: there is no denying that we are still in the early days of robotics adoption in agriculture. Cont'd...
Tom Simonite for MIT Technology Review: Each of these trucks is the size of a small two-story house. None has a driver or anyone else on board.
Mining company Rio Tinto has 73 of these titans hauling iron ore 24 hours a day at four mines in Australia’s Mars-red northwest corner. At this one, known as West Angelas, the vehicles work alongside robotic rock drilling rigs. The company is also upgrading the locomotives that haul ore hundreds of miles to port—the upgrades will allow the trains to drive themselves, and be loaded and unloaded automatically.
Rio Tinto intends its automated operations in Australia to preview a more efficient future for all of its mines—one that will also reduce the need for human miners. The rising capabilities and falling costs of robotics technology are allowing mining and oil companies to reimagine the dirty, dangerous business of getting resources out of the ground. Cont'd...
Alan Boyle for GeekWire: If there are any Robin Hoods out there who are thinking about shooting down drones while they’re making deliveries, Amazon has a patented plan to stop you.
The patent, filed in 2014 but published just last week, lays out countermeasures for potential threats ranging from computer hacking to lightning flashes to bows and arrows.
If nothing else, the 33-page application illustrates how many things could possibly go wrong with an autonomous navigation system for unmanned aerial vehicles, or UAVs.
The “compromise system” that Amazon’s engineers propose relies on an array of sensors to orient the drone based on the sun’s position in the sky, if need be. That’s in case the drone gets confused by, say, lightning or a muzzle flash. Cont'd...
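The patent coverage does not specify the math, but the idea reduces to simple frame arithmetic: if an ephemeris supplies the sun's azimuth in the world frame, and an onboard sun sensor measures the sun's bearing relative to the airframe, the difference recovers heading. A toy sketch of that idea only; the function name and angle conventions are assumptions, not Amazon's design:

```python
# Toy sun-compass: recover heading from the difference between the
# sun's known world-frame azimuth (here simply given, as if from an
# ephemeris lookup) and its measured bearing relative to the drone's
# nose. Useful when magnetometers or GPS are confused by, say, a
# lightning strike. Purely illustrative of the concept.

def heading_from_sun(sun_azimuth_world_deg, sun_bearing_body_deg):
    """Heading in degrees (0 = north, clockwise) from sun observations."""
    return (sun_azimuth_world_deg - sun_bearing_body_deg) % 360.0

# Sun at azimuth 135 degrees (southeast); sensors see it 45 degrees to
# the right of the nose, so the drone must be facing east (90 degrees).
print(heading_from_sun(135.0, 45.0))  # 90.0
```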
Steve Arar for All About Circuits: Vijay Kumar’s lab at the University of Pennsylvania, in cooperation with researchers from Qualcomm, recently unveiled a quadrotor that can fly aggressively through a window. You may think you have seen similar robots before; however, there is a big difference between previously designed robots and this new one.
Generally, to perform such challenging maneuvers, a quadrotor depends on an array of cameras mounted on the walls and on external processors. The images captured by the cameras are processed off-board and the result is delivered to the robot; the external computer issues precise commands, and all the robot needs to do is follow them. The new robot, however, performs both image capture and processing onboard.
The quadrotor carries an IMU and a Qualcomm Snapdragon processor with a Hexagon DSP. With these onboard sensors and processors, the robot is able to perform localization, state estimation, and path planning autonomously. Cont'd...
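The article does not detail the estimator, but fusing gyro and accelerometer data is the usual first layer of onboard state estimation. Below is a generic complementary-filter sketch for a single attitude angle, a textbook technique rather than the actual Penn/Qualcomm pipeline:

```python
# Complementary filter for one attitude angle (e.g. roll): the gyro
# integrates smoothly but drifts; the accelerometer's gravity-derived
# angle is noisy but drift-free. Blending the two gives a cheap,
# stable estimate suitable for a small onboard processor.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter update: trust the gyro short-term, the accel long-term."""
    gyro_angle = angle + gyro_rate * dt      # integrate angular rate
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Stationary vehicle: the accel reports 0 degrees while the gyro has a
# constant 0.5 deg/s bias. The accel term bounds the resulting drift
# instead of letting the integrated error grow without limit.
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=0.0, dt=0.01)
```

With `alpha = 0.98` the bias settles at a small fixed offset (0.245 degrees here) rather than accumulating, which is the whole point of the blend.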
Alistair Blair for Bloomberg Technology: The word "robot" conjures images of bulky metal humanoids moving awkwardly. Robotics veteran Rich Mahoney is trying to change that perception by creating a robotic exoskeleton people can wear.
After more than seven years running a robotics group at Silicon Valley research institution SRI International, Mahoney left about a year ago to form a startup called Superflex. On Tuesday, the company said it raised $9.6 million from investors including Japanese venture capital group Global Brain and Horizons Ventures, the VC fund of Asian billionaire Li Ka-shing.
Superflex is developing a lightweight suit with electric "muscles" that help the elderly and other less-mobile people move around. The system, which will look a bit like a unitard, is designed to provide the wearer with extra strength to get up from a chair or stand for longer. The device has thin actuators built in that use battery power to contract at the same time as people's real muscles. Cont'd...
From comma.ai: Last week, we open sourced an advanced driver assistance system in order to help accelerate the future of self-driving cars and provide a platform anyone can build on top of. We released both openpilot, driving agent research software, and NEO, a robotics platform capable of running openpilot, under the MIT license.
openpilot is an open source adaptive cruise control and lane keeping assist system, both safety features available on modern cars. We would like to build the best ones on the market, and help you retrofit them to existing cars.
NEO is an open source robotics research platform. It is centered around an Android phone, similar to Android Based Robots. The modern smartphone is an incredible platform packed with sensors and processing power. NEO also includes a cooling solution and a CAN interface board. CAN is a networking protocol used in cars, trucks, power wheelchairs, golf carts, and many other robotics applications.
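A classic CAN frame is just an identifier, a length code, and up to eight data bytes. As a sketch of what travels over the bus, the following packs one frame in the Linux SocketCAN `struct can_frame` layout; the ID and payload are made up, and actually transmitting would use a PF_CAN raw socket or the python-can library:

```python
import struct

def pack_can_frame(can_id, data):
    """Pack a classic CAN frame into SocketCAN's 16-byte wire struct."""
    if len(data) > 8:
        raise ValueError("classic CAN carries at most 8 data bytes")
    # <I : 32-bit CAN ID (flags for extended/RTR/error live in the top bits)
    # B  : data length code
    # 3x : padding/reserved bytes
    # 8s : data, zero-padded to 8 bytes
    return struct.pack("<IB3x8s", can_id, len(data), bytes(data))

# A hypothetical 4-byte message on standard (11-bit) ID 0x123.
frame = pack_can_frame(0x123, b"\x01\x02\x03\x04")
assert len(frame) == 16
```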
With a forthcoming openpilot release, it will become easier for researchers to add support for their own vehicle. On older cars, some actuators may be harder to control than others, but it should still be possible to control the gas electronically, yielding a gas-only adaptive cruise control. Researchers can also add mechanical actuators for controls that cannot be electronically actuated.
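A gas-only system can add throttle but never brake, so the controller's output is clamped at zero and the car simply coasts when it is too close. A minimal proportional sketch of that idea; the gains and time-gap policy are invented for illustration and are not openpilot's:

```python
# Gas-only adaptive cruise control: drive throttle toward a constant
# time-gap behind the lead car. Because the output is clamped to
# [0, 1], the controller can only lift off the gas to slow down.

def gas_only_acc(gap_m, ego_speed, lead_speed, time_gap=1.8,
                 k_gap=0.05, k_rel=0.1):
    """Return a throttle command in [0, 1] (speeds in m/s, gap in m)."""
    desired_gap = time_gap * ego_speed            # constant time-gap policy
    gap_error = gap_m - desired_gap               # >0 means room to speed up
    rel_speed = lead_speed - ego_speed            # >0 means gap is opening
    throttle = k_gap * gap_error + k_rel * rel_speed
    return min(1.0, max(0.0, throttle))           # gas only: never negative

# Far behind a faster lead car: full throttle.
print(gas_only_acc(gap_m=100.0, ego_speed=20.0, lead_speed=25.0))  # 1.0
# Too close to a slower lead car: throttle drops to zero and we coast.
print(gas_only_acc(gap_m=20.0, ego_speed=25.0, lead_speed=20.0))   # 0.0
```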
Glenn McDonald for Seeker: Want to know what drones of the future will look like?
So does David Lentink, editor of Interface Focus, a journal that, as its title suggests, looks at the interface of different scientific disciplines. Each issue zeroes in on a particular intersection of physical sciences and life sciences and invites the world's top scholars to publish their latest work.
The latest issue of Interface Focus brings together biologists and engineers to discuss a topic that's relatively straightforward and, well, pretty empirically cool:
"It's completely focused on how animals fly and how that can help us build flying robots," said Lentink, assistant professor of mechanical engineering at Stanford. Can't argue with that.
The new issue features 18 newly published papers on various ways that engineers are borrowing ideas from nature to make the next generation of drones and aerial robots. Several of the papers detail prototype drones that have already been built and tested. Cont'd...
From Alexander Shtuchkin:
Code & schematics for a position-tracking sensor using HTC Vive's Lighthouse system and a Teensy board.
- General purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; less than 2 mm possible with additional work.
- Update frequency: 30 Hz
- Output formats: Text; Mavlink ATT_POS_MOCAP via serial; Ublox GPS emulation (in the works)
- HTC Vive base station visibility requirements: full top hemisphere from the sensor; both base stations need to be visible.
- Positioning volume: same as HTC Vive, approx up to 4x4x3 meters.
- Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations (2x $135))
- Skills to build: low-complexity soldering; embedded C++ recommended for integrating it into your project.
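For integrators curious what the Mavlink output carries: ATT_POS_MOCAP (message id 138) holds a timestamp, an attitude quaternion, and an x/y/z position. The sketch below packs just the 36-byte payload in MAVLink wire order (fields sorted largest type first) using only the standard library; a real integration would use pymavlink, which also adds the framing, sequence number, and checksum:

```python
import struct

def att_pos_mocap_payload(time_usec, q, x, y, z):
    """Pack the ATT_POS_MOCAP payload: u64 timestamp, then 4 + 3 floats."""
    # Wire order: time_usec (uint64), q[4] (float), x, y, z (float),
    # all little-endian, for a 36-byte payload.
    return struct.pack("<Q4f3f", time_usec, *q, x, y, z)

# Identity attitude at a made-up position (1.0, 2.0, 0.5) meters.
payload = att_pos_mocap_payload(1_000_000, (1.0, 0.0, 0.0, 0.0),
                                1.0, 2.0, 0.5)
assert len(payload) == 36
```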