Endeavor Robotics Submits Proposal Response as Prime System Integrator (PSI) for the Advanced Explosive Ordnance Disposal Robotic System (AEODRS) Increments 2 & 3
Endeavor Robotics (formerly iRobot Defense & Security), the United States-based leader in ground robotics for defense, first responders, and the nuclear industry, submitted a formal proposal response for the AEODRS Increment 2 Tactical Operations System (TOS) & Increment 3 Base/Infrastructure Operations (BIOS) Variants.
Bodkin Design Unveils Unique Snapshot Hyperspectral Products Through Exclusive Partnership With Cubert
Bodkin Design and Engineering, LLC will team with Cubert GmbH at Photonics West 2017 for a presentation and product demo to showcase their patented snapshot spectral imaging for precision agriculture, medical and biotech, machine vision, and more.
Open Source Robotics and IoT Framework Releases Version 1.0
TraceParts challenged its engineering community to take part in a mystery word competition
Because LiDAR uses light, the target must be visible to the sensor, so it is not an all-weather solution. It performs poorly in fog and other conditions that reduce visibility, but in clear conditions it operates during both day and night.
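The ranging principle behind LiDAR is simple time-of-flight arithmetic: a light pulse travels to the target and back, and distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the target given the laser pulse's round-trip time in seconds."""
    return C * round_trip_s / 2.0


# A pulse returning after 200 nanoseconds corresponds to a target ~30 m away
print(round(tof_distance_m(200e-9), 2))  # 29.98
```

In practice the timing electronics, not the math, set the accuracy: resolving centimeters requires timing the return to within tens of picoseconds.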
From comma.ai: Last week, we open sourced an advanced driver assistance system to help accelerate the future of self-driving cars and provide a platform anyone can build on top of. We released both openpilot, our driving agent research software, and NEO, a robotics platform capable of running openpilot, under the MIT license. openpilot is an open source adaptive cruise control and lane-keeping assist system, both safety features available on modern cars. We would like to build the best ones on the market and help you retrofit them to existing cars. NEO is an open source robotics research platform. It is centered around an Android phone, similar to Android Based Robots. The modern smartphone is an incredible platform packed with sensors and processing power. NEO also includes a cooling solution and a CAN interface board. CAN is a networking protocol used in cars, trucks, power wheelchairs, golf carts, and many other robotics applications. With a forthcoming openpilot release, it will become easier for researchers to add support for their own vehicles. On older cars, some actuators may be harder to control than others, but it should be very possible to control the gas electronically for a gas-only adaptive cruise control. Researchers can also add mechanical actuators for controls that cannot be electronically actuated. Have fun, be safe, and let's usher in the future of self-driving cars together... (Github repo) (Interview)
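The CAN frames that systems like NEO's interface board carry are tiny: an identifier plus up to 8 data bytes. As a rough illustration (not comma.ai's code), here is a sketch of packing and unpacking the 16-byte `can_frame` layout used by Linux SocketCAN; the message ID 0x1D0 and payload are hypothetical:

```python
import struct

# Linux SocketCAN can_frame: 32-bit ID, 1-byte data length, 3 pad bytes, 8 data bytes
CAN_FRAME = struct.Struct("<IB3x8s")


def pack_frame(can_id: int, data: bytes) -> bytes:
    """Serialize a CAN ID and payload (<= 8 bytes) into the wire layout."""
    assert len(data) <= 8, "classic CAN payloads are at most 8 bytes"
    return CAN_FRAME.pack(can_id, len(data), data.ljust(8, b"\x00"))


def unpack_frame(raw: bytes):
    """Return (can_id, payload) from a packed frame, trimming padding."""
    can_id, dlc, data = CAN_FRAME.unpack(raw)
    return can_id, data[:dlc]


raw = pack_frame(0x1D0, b"\x02\x10\x03")  # hypothetical message ID and payload
print(unpack_frame(raw))                   # (464, b'\x02\x10\x03')
```

Everything on a vehicle bus, from throttle commands to wheel-speed reports, travels in frames of this shape; "adding support for your own vehicle" largely means learning which IDs and byte layouts your car uses.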
The contest will inevitably take up a large area of the circuit, since it will consist of three challenges and a triathlon-style Grand Challenge. Challenge 1 will require teams to use an Unmanned Aerial Vehicle (UAV) to locate, track, and land on a moving vehicle. This means that the UAV must be capable of locating the moving vehicle, matching its path and speed, and landing on the target location while the vehicle is in motion.
Optelos Launches Turnkey Solution for Drone Workflow, Solving Key Challenges Holding Back Drone Programs
Optelos' turnkey approach is setting the standard for securely managing drone work activities and data content.
CES 2017 in Las Vegas Brings AutonomouStuff and NEXCOM Together to Innovate Automated Driving Solutions
Glenn McDonald for Seeker: Want to know what drones of the future will look like? So does David Lentink, editor of Interface Focus, a journal that, as its title suggests, looks at the interface of different scientific disciplines. Each issue zeroes in on a particular intersection of physical sciences and life sciences and invites the world's top scholars to publish their latest work. The latest issue of Interface Focus brings together biologists and engineers to discuss a topic that's relatively straightforward and, well, pretty empirically cool: "It's completely focused on how animals fly and how that can help us build flying robots," said Lentink, assistant professor of mechanical engineering at Stanford. Can't argue with that. The new issue features 18 newly published papers on various ways that engineers are borrowing ideas from nature to make the next generation of drones and aerial robots. Several of the papers detail prototype drones that have already been built and tested. Cont'd...
From Alexander Shtuchkin: Code & schematics for a position-tracking sensor using HTC Vive's Lighthouse system and a Teensy board. A general-purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; less than 2 mm possible with additional work
- Update frequency: 30 Hz
- Output formats: text; Mavlink ATT_POS_MOCAP via serial; Ublox GPS emulation (in the works)
- HTC Vive station visibility requirements: full top hemisphere from the sensor; both stations need to be visible
- Positioning volume: same as HTC Vive, approx. up to 4x4x3 meters
- Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations, 2x $135)
- Skills to build: low-complexity soldering; embedded C++ recommended for integration into your project
(Github page)
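The core measurement in a Lighthouse receiver like this is timing: each base station emits a sync flash and then sweeps a laser line across the room, and the delay between sync and the photodiode hit encodes an angle. A minimal sketch of that conversion, assuming a 60 Hz rotor (the Lighthouse 1.0 spin rate); the exact offsets and axis alternation in the real firmware are more involved:

```python
import math

ROTOR_HZ = 60.0            # Lighthouse 1.0 rotors spin at 60 Hz
PERIOD_S = 1.0 / ROTOR_HZ  # one full revolution takes ~16.67 ms


def sweep_angle_rad(dt_s: float) -> float:
    """Angle the laser plane has swept since the sync pulse, after dt_s seconds."""
    return 2.0 * math.pi * (dt_s / PERIOD_S)


# A photodiode hit 1/240 s after sync means the beam is a quarter-turn in
print(round(math.degrees(sweep_angle_rad(PERIOD_S / 4.0)), 1))  # 90.0
```

With two angles per station and two visible stations, the sensor's 3D position falls out of ray intersection, which is why both stations must see the sensor's top hemisphere.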
The insect drone takes on the functions of larger UAVs, but shrinks those capabilities into a miniature, undetectable device.
Under the FAA Pathfinder Program, PrecisionHawk's Phase 2 research indicates that technology assistance is critical for BVLOS operations.
Agreement brings additional, complete UAV solution packages to RDO Equipment Co. customers
Written by AZoRobotics: Most robots achieve grasping and tactile sensing through motorized means, which can be excessively bulky and rigid. A Cornell group has devised a way for a soft robot to feel its surroundings internally, in much the same way humans do. A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, has published a paper describing how stretchable optical waveguides act as curvature, elongation, and force sensors in a soft robotic hand. Doctoral student Huichan Zhao is lead author of “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” which is featured in the debut edition of Science Robotics. The paper was published Dec. 6; also contributing were doctoral students Kevin O’Brien and Shuo Li, both of Shepherd’s lab. Cont'd...
Zaber's X-LRQ-DE Series of linear stages have high stiffness, load, and lifetime capabilities in a compact size. The integrated linear encoder combined with stage calibration provides high accuracy positioning over the full travel of the device. At 36 mm high, these stages are excellent for applications where a low profile is required. The X-LRQ-DE's innovative design allows speeds up to 205 mm/s and loads up to 100 kg. Like all Zaber products, the X-LRQ-DE Series is designed for easy set-up and operation.