Glenn McDonald for Seeker: Want to know what drones of the future will look like?
So does David Lentink, editor of Interface Focus, a journal that, as its title suggests, looks at the interface of different scientific disciplines. Each issue zeroes in on a particular intersection of physical sciences and life sciences and invites the world's top scholars to publish their latest work.
The latest issue of Interface Focus brings together biologists and engineers to discuss a topic that's relatively straightforward and, well, pretty empirically cool:
"It's completely focused on how animals fly and how that can help us build flying robots," said Lentink, assistant professor of mechanical engineering at Stanford. Can't argue with that.
The new issue features 18 newly published papers on various ways that engineers are borrowing ideas from nature to make the next generation of drones and aerial robots. Several of the papers detail prototype drones that have already been built and tested. Cont'd...
From Alexander Shtuchkin:
Code & schematics for position tracking sensor using HTC Vive's Lighthouse system and a Teensy board.
- General purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; under 2 mm possible with additional work.
- Update frequency: 30 Hz
- Output formats: Text; MAVLink ATT_POS_MOCAP via serial; u-blox GPS emulation (in the works)
- HTC Vive Station visibility requirements: full top hemisphere from sensor. Both stations need to be visible.
- Positioning volume: same as HTC Vive, up to approximately 4 x 4 x 3 meters.
- Cost: ~$10 in parts + Teensy 3.2 ($20), plus two Lighthouse stations ($135 each)
- Skills to build: Low complexity soldering; Embedded C++ recommended for integration to your project.
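The geometry behind such a sensor can be sketched roughly: each Lighthouse base station sweeps laser planes across the room, and the timing of the sweeps hitting the photodiode yields two angles, which define a ray from that station toward the sensor. With both stations visible, the 3D position is the near-intersection of the two rays. The sketch below illustrates that triangulation step only; it is not Shtuchkin's actual code, and the function names and angle convention (azimuth/elevation relative to each station) are assumptions.

```python
import numpy as np

def ray_from_angles(origin, azimuth, elevation):
    """Unit-direction ray from a base station, given its two sweep angles (radians)."""
    d = np.array([np.sin(azimuth) * np.cos(elevation),
                  np.sin(elevation),
                  np.cos(azimuth) * np.cos(elevation)])
    return origin, d / np.linalg.norm(d)

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two skew rays (o + t*d).

    Standard closest-point-between-two-lines formula; the midpoint is the
    least-squares position estimate when the rays do not exactly intersect.
    """
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b  # zero only if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((o1 + s * d1) + (o2 + t * d2)) / 2
```

In practice the hard part is recovering the sweep angles from pulse timings on the microcontroller and calibrating the base-station poses; the closed-form triangulation above is the cheap final step, which is why a Teensy-class board can keep up at 30 Hz.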
Written by AZoRobotics: Most robots achieve grasping and tactile sensing through motorized means, which can be excessively bulky and rigid. A Cornell group has devised a way for a soft robot to feel its surroundings internally, in much the same way humans do.
A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, has published a paper describing how stretchable optical waveguides act as curvature, elongation and force sensors in a soft robotic hand.
Doctoral student Huichan Zhao is lead author of “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” which is featured in the debut edition of Science Robotics. The paper published Dec. 6; also contributing were doctoral students Kevin O’Brien and Shuo Li, both of Shepherd’s lab. Cont'd...
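The sensing principle is that bending or stretching the waveguide increases optical power loss between the LED and the photodiode, so a calibration curve maps the measured loss back to a deformation estimate. A rough illustration of that idea, using made-up calibration numbers (not values from the paper) and a simple linear fit:

```python
import numpy as np

# Hypothetical calibration data: optical power loss (dB) measured at known
# bend curvatures (1/m) for a stretchable waveguide. Illustrative only.
curvatures = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
power_loss_db = np.array([0.1, 1.2, 2.3, 3.4, 4.5])

# Fit loss = m * curvature + b, then invert it so a new photodiode
# reading can be turned into a curvature estimate.
m, b = np.polyfit(curvatures, power_loss_db, 1)

def estimate_curvature(loss_db):
    """Estimate bend curvature (1/m) from a measured optical loss (dB)."""
    return (loss_db - b) / m
```

A real sensor would need per-waveguide calibration and would have to separate curvature, elongation and contact force, which the paper addresses with multiple waveguides routed through the fingers; the point here is only that the transduction from light loss to shape is a simple lookup once calibrated.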
Visteon's Silicon Valley Technical Center to Lead Development of Artificial Intelligence for Autonomous Vehicles
Evan Ackerman for IEEE Spectrum: As sensors, computers, actuators, and batteries decrease in size and increase in efficiency, it becomes possible to make robots much smaller without sacrificing a whole lot of capability. There’s a lower limit on usefulness, however, if you’re making a robot that needs to interact with humans or human-scale objects. You can continue to leverage shrinking components if you make robots that are modular: in other words, big robots that are made up of lots of little robots.
In some ways, it's more complicated to do this: if one robot is complicated, many robots are more complicated still. If you can get all of the communication and coordination figured out, though, a modular system offers tons of advantages: robots that come in any size you want, in any configuration you want, and that are exceptionally easy to repair and reconfigure on the fly.
MIT’s ChainFORM is an interesting take on this idea: it’s an evolution of last year’s LineFORM multifunctional snake robot that introduces modularity to the system, letting you tear off a strip of exactly as much robot as you need, then reconfigure it to do all kinds of things. Cont'd...
Milrem: In the Foreseeable Future Soldiers in Most Dangerous Situations Will Be Replaced by Smart Robotic Systems