Because LiDAR uses light, the target must be visible, so it is not an all-weather solution. It won't work well in fog or other conditions that reduce visibility, but when conditions are clear it can operate during both day and night.
From comma.ai: Last week, we open sourced an advanced driver assistance system in order to help accelerate the future of self driving cars and provide a platform anyone can build on top of. We released both openpilot, driving agent research software, and NEO, a robotics platform capable of running openpilot, under the MIT license. openpilot is an open source adaptive cruise control and lane keeping assist system, both safety features available on modern cars. We would like to build the best ones on the market, and help you retrofit them to existing cars. NEO is an open source robotics research platform. It is centered around an Android phone, similar to Android Based Robots. The modern smartphone is an incredible platform packed with sensors and processing power. NEO also includes a cooling solution and a CAN interface board. CAN is a networking protocol used in cars, trucks, power wheelchairs, golf carts, and many other robotics applications. With a forthcoming openpilot release, it will become easier for researchers to add support for their own vehicle. On older cars, some actuators may be harder to control than others, but it should be very possible to control the gas electronically to have a gas only adaptive cruise control. It's also possible for researchers to add mechanical actuators for the controls that cannot be electronically actuated. Have fun, be safe, and let's usher in the future of self driving cars together... (Github repo) (Interview)
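Since both openpilot and NEO talk to the vehicle over CAN, a minimal sketch of reading and writing CAN frames helps show what the interface board exposes. This example uses the python-can library on a Linux SocketCAN interface; the channel name, arbitration ID, and payload below are illustrative assumptions, not openpilot's actual message definitions.

```python
# Minimal sketch of talking to a vehicle CAN bus with python-can.
# Assumed setup: a SocketCAN interface named "can0"; the ID and payload
# are illustrative, not openpilot's real message definitions.
import can

def main():
    # Open the bus; on NEO-like hardware the CAN interface board backs this.
    bus = can.interface.Bus(channel="can0", bustype="socketcan")

    # Send a hypothetical 8-byte frame with ID 0x123.
    msg = can.Message(arbitration_id=0x123,
                      data=[0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00],
                      is_extended_id=False)
    bus.send(msg)

    # Read a handful of frames and print their IDs and payloads.
    for _ in range(10):
        frame = bus.recv(timeout=1.0)  # returns None on timeout
        if frame is not None:
            print(f"ID=0x{frame.arbitration_id:03X} data={frame.data.hex()}")

if __name__ == "__main__":
    main()
```

Actual vehicle control (steering, gas, brakes) would go through openpilot's own message packers and safety model rather than raw frames like these.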
The contest will inevitably take up a large area of the circuit since it will consist of three challenges and a triathlon-type Grand Challenge. Challenge 1 will require teams to use an Unmanned Aerial Vehicle (UAV) to locate, track and land on a moving vehicle. This means that the UAV must be capable of locating the moving vehicle, matching its path and speed, and landing on the target location while the vehicle is in motion; a rough control sketch follows below.
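As a rough illustration of what that entails, here is a minimal pursuit-and-descend sketch: the UAV predicts where the vehicle will be a short time ahead, commands a velocity toward that point, and descends once roughly overhead. The gains, lookahead time, and the get_*/send_velocity hooks are hypothetical placeholders, not part of any team's actual stack.

```python
import numpy as np

# Hypothetical hooks into the flight stack; names are placeholders.
def get_target_state():
    """Return (position_xy, velocity_xy) of the ground vehicle."""
    ...

def get_uav_position():
    """Return the UAV position as [x, y, z]."""
    ...

def send_velocity(vx, vy, vz):
    """Command a velocity setpoint to the autopilot."""
    ...

def pursuit_step(lookahead_s=1.0, kp=1.2, descend_rate=0.5, overhead_radius=0.5):
    """One control tick: chase the predicted target position, descend when overhead."""
    tgt_pos, tgt_vel = get_target_state()
    uav_pos = np.asarray(get_uav_position())

    # Predict where the vehicle will be lookahead_s seconds from now.
    aim = np.asarray(tgt_pos) + lookahead_s * np.asarray(tgt_vel)

    # Proportional horizontal velocity toward the predicted point,
    # plus the target's own velocity so the UAV matches its speed.
    err = aim - uav_pos[:2]
    v_xy = kp * err + np.asarray(tgt_vel)

    # Only descend once the UAV is roughly above the vehicle.
    v_z = -descend_rate if np.linalg.norm(err) < overhead_radius else 0.0
    send_velocity(v_xy[0], v_xy[1], v_z)
```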
Optelos Launches Turnkey Solution for Drone Workflow, Solving Key Challenges Holding Back Drone Programs
Optelos' turnkey approach is setting the standard for securely managing drone work activities and data content.
CES 2017 in Las Vegas Brings AutonomouStuff and NEXCOM Together to Innovate Automated Driving Solutions
Glenn McDonald for Seeker: Want to know what drones of the future will look like? So does David Lentink, editor of Interface Focus, a journal that, as its title suggests, looks at the interface of different scientific disciplines. Each issue zeroes in on a particular intersection of physical sciences and life sciences and invites the world's top scholars to publish their latest work. The latest issue of Interface Focus brings together biologists and engineers to discuss a topic that's relatively straightforward and, well, pretty empirically cool: "It's completely focused on how animals fly and how that can help us build flying robots," said Lentink, assistant professor of mechanical engineering at Stanford. Can't argue with that. The new issue features 18 newly published papers on various ways that engineers are borrowing ideas from nature to make the next generation of drones and aerial robots. Several of the papers detail prototype drones that have already been built and tested. Cont'd...
From Alexander Shtuchkin: Code & schematics for a position tracking sensor using HTC Vive's Lighthouse system and a Teensy board. A general-purpose indoor positioning sensor, good for robots, drones, etc.
- 3D position accuracy: currently ~10 mm; less than 2 mm possible with additional work
- Update frequency: 30 Hz
- Output formats: text; Mavlink ATT_POS_MOCAP via serial; Ublox GPS emulation (in the works)
- HTC Vive station visibility requirements: full top hemisphere from the sensor; both stations need to be visible
- Positioning volume: same as HTC Vive, approx. up to 4x4x3 meters
- Cost: ~$10 + Teensy 3.2 ($20) (+ Lighthouse stations (2x $135))
- Skills to build: low-complexity soldering; Embedded C++ recommended for integration into your project
(Github page)
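To make the underlying approach concrete, here is a minimal sketch of how Lighthouse-style positioning can work: each base station's sweep timing yields two angles (a ray direction), and the 3D position is taken as the point of closest approach between the two rays. The timing constant, axis conventions, and the assumption that base-station poses are already known are simplifications for illustration; the actual firmware in the repo handles calibration, filtering, and output formatting.

```python
import numpy as np

# Lighthouse 1.0 rotors spin at 60 Hz, so one full rotation takes 1/60 s.
ROTATION_PERIOD_S = 1.0 / 60.0

def sweep_angle(t_hit_s, t_sync_s):
    """Convert the delay between sync flash and laser-sweep hit into an angle (rad)."""
    return 2.0 * np.pi * (t_hit_s - t_sync_s) / ROTATION_PERIOD_S

def ray_from_angles(azimuth, elevation):
    """Unit direction of the ray leaving a base station, given its two sweep angles.
    (Sign/axis conventions are illustrative; real stations need factory calibration.)"""
    d = np.array([np.tan(azimuth), np.tan(elevation), 1.0])
    return d / np.linalg.norm(d)

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment connecting two skew rays p_i + t*d_i."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Example with made-up station poses (world frame) and sweep angles.
station_a = np.array([0.0, 0.0, 2.5])
station_b = np.array([4.0, 0.0, 2.5])
pos = closest_point_between_rays(
    station_a, ray_from_angles(0.20, -0.40),
    station_b, ray_from_angles(-0.25, -0.38))
print("estimated position:", pos)
```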
The insect drone takes on the functions of larger UAVs, but packs those capabilities into a miniature, virtually undetectable device.
Under the FAA Pathfinder Program, PrecisionHawk's Phase 2 research indicates technology assist is critical for BVLOS operations.
Agreement brings additional, complete UAV solution packages to RDO Equipment Co. customers
Written by AZoRobotics: Most robots achieve grasping and tactile sensing through motorized means, which can be excessively bulky and rigid. A Cornell group has devised a way for a soft robot to feel its surroundings internally, in much the same way humans do. A group led by Robert Shepherd, assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, has published a paper describing how stretchable optical waveguides act as curvature, elongation and force sensors in a soft robotic hand. Doctoral student Huichan Zhao is lead author of “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” which is featured in the debut edition of Science Robotics. The paper was published Dec. 6; also contributing were doctoral students Kevin O’Brien and Shuo Li, both of Shepherd’s lab. Cont'd...
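The sensing principle described above is that bending or stretching the waveguide attenuates the light reaching a photodetector, so deformation can be recovered from the measured intensity. Below is a minimal sketch of that idea: fit a calibration curve from photodiode readings taken at known curvatures, then invert it at run time. The sample data and the linear model are illustrative assumptions, not the paper's actual characterization.

```python
import numpy as np

# Hypothetical calibration data: normalized photodiode output recorded while
# bending the waveguide to known curvatures (1/m). Real data would come from a jig.
known_curvature = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
measured_intensity = np.array([1.00, 0.91, 0.83, 0.74, 0.66])

# Fit intensity as a linear function of curvature; the premise is that optical
# loss grows with deformation, and a line is the simplest stand-in for that.
slope, intercept = np.polyfit(known_curvature, measured_intensity, 1)

def estimate_curvature(intensity):
    """Invert the calibration fit to map a live photodiode reading to curvature."""
    return (intensity - intercept) / slope

print(estimate_curvature(0.80))  # e.g. a reading of 0.80 -> ~11.6 1/m
```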
A fully autonomous automobile must be able to decide whether it can safely enter an intersection, and how to maneuver around other vehicles, people, and other moving objects.
Visteon's Silicon Valley Technical Center to Lead Development of Artificial Intelligence for Autonomous Vehicles
The recently opened facility in Santa Clara, California, will work closely with global Visteon tech centers to develop expertise in artificial intelligence software, advanced driver assistance systems (ADAS) and deep machine learning.
Application of integrated circuitry leads to a new approach to LiDAR sensors for the autonomous vehicle, 3D mapping, and drone industries
AutonomouStuff, the leader in enabling the future of transportation, will proudly unveil additions to their R&D Platform fleet and modular-based software applications at CES 2017. The fleet, which expands on their original Lincoln MKZ-based platform, now includes the Ford Fusion, Polaris Ranger and Polaris GEM.
LiDAR (Light Detection and Ranging) is one of the most reliable methods for parts sensing in factory automation today. SICK has made this technology affordable and easy to use! Click on the video link below to learn about the TiM1xx LiDAR sensor and how it provides:
- Area scanning LiDAR technology in a standard sensor package
- 200-degree field of view and 3 meter sensing range, allowing 169 square feet of area scanning
- Compact size and light weight for ease of deployment in "non-standard" applications, like end-of-arm robotic tooling
- Wide area scanning without having to mount a transmitter and receiver, making installation much easier
- IO-Link for easy configuration
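As a quick sanity check of the quoted coverage, the area of a 200-degree sector with a 3-meter radius works out to the advertised ~169 square feet:

```python
import math

fov_deg = 200.0   # field of view quoted for the TiM1xx
range_m = 3.0     # sensing range quoted for the TiM1xx

area_m2 = (fov_deg / 360.0) * math.pi * range_m ** 2  # circular-sector area
area_ft2 = area_m2 * 10.7639                          # square meters -> square feet

print(f"{area_m2:.1f} m^2  ~=  {area_ft2:.0f} ft^2")  # ~15.7 m^2 ~= 169 ft^2
```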