From NASA at the Johnson Space Center:
From 3D Robotics: Today 3DR announced Solo, an all-in-one personal drone whose ease of use and powerful new features kick off a new aerial age. With computing power unmatched in the industry—from two integrated Linux computers, one on the craft and one in the controller—Solo delivers several world-first technologies, such as unfettered in-flight access to GoPro controls, including wireless HD streaming straight to mobile devices, and effortless computer-assisted Smart Shot flight features that allow even new pilots to capture professional aerial video from day one.

Computer-assisted cinematography

Solo’s intelligence unlocks powerful and one-of-a-kind computer-assisted Smart Shots (patent pending). Just set up the exact shot you want in real time, then tap “play” on the app and Solo will execute it with a level of precision and a soft touch that even seasoned cinema pilots can’t match. And with a list of Smart Shots to choose from, the perfect shot is always just a few taps away. Cable cam and Orbit allow you to create a known and safe flight path along a virtual track in space, freeing you to shift your focus to getting the shot you want; or simply hit “play” and let Solo fly itself while simultaneously working the camera for you, as smooth and even as an expert cameraman. Follow mode lets you go completely hands-free, while Solo keeps up with your every move. Solo also features a one-touch aerial Selfie for a dramatic and customizable establishing Smart Shot of you and your surroundings. It’s all in service of Solo’s guiding principle: Get the shot. Every time.

Stream HD GoPro video to mobile devices or through HDMI

Solo is the world’s first drone to wirelessly deliver HD video from your GoPro to your iOS or Android mobile device, at ranges of up to half a mile.
Solo’s controller has HDMI output for live HD broadcast—to FPV goggles, high-quality field monitors, Jumbotrons at live events, even news vans—and with a staggeringly low video latency your live video is immediate and fluid for an “in-the-moment” feel. For instant social sharing of your aerial videos, you can even record the live video stream directly from the mobile app to your device’s camera roll... ( full details )
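The cable cam idea above amounts to interpolating the craft's position along a pilot-defined virtual track while the autopilot keeps the camera on the subject. A minimal sketch of that geometry, with entirely made-up names and a simple two-keyframe straight cable (not 3DR's actual flight code):

```python
import math

def lerp(a, b, t):
    """Linear interpolation between two 3-D points for t in [0, 1]."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def cable_cam_position(start, end, t):
    """Position along the virtual cable; progress t is clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return lerp(start, end, t)

def look_at_yaw(pos, target):
    """Yaw (radians) so the camera points from pos toward target."""
    return math.atan2(target[1] - pos[1], target[0] - pos[0])

# Two keyframes set by the pilot, plus a subject to keep in frame.
start, end = (0.0, 0.0, 10.0), (30.0, 40.0, 10.0)
subject = (15.0, 50.0, 0.0)
p = cable_cam_position(start, end, 0.5)   # midpoint of the cable
yaw = look_at_yaw(p, subject)
```

A real Smart Shot would also shape velocity along the track and smooth the camera gimbal, but the core "known, safe path in space" is just this parameterized curve.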
From DJI: More videos and complete specs... ( here )
From University of Tokyo: In our laboratory, the Lumipen system has been proposed to solve the time-geometric inconsistency caused by the delay when using dynamic objects. It consists of a projector and a high-speed optical axis controller with high-speed vision and mirrors, called Saccade Mirror ( 1ms Auto Pan-Tilt technology). Lumipen can provide projected images that are fixed on dynamic objects such as bouncing balls. However, the robustness of the tracking is sensitive to the simultaneous projection on the object, as well as the environmental lighting... ( full article )
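The essence of the Saccade Mirror's 1ms Auto Pan-Tilt is a very fast feedback loop: high-speed vision reports the target's offset from the optical axis, and the mirror angles are nudged each tick to cancel that error. A toy proportional-control sketch of that loop, with invented gains and interfaces (not the Lumipen implementation):

```python
def mirror_step(pan, tilt, err_x, err_y, gain=0.5):
    """One control tick: move mirror angles toward the tracked target."""
    return pan + gain * err_x, tilt + gain * err_y

# Target direction in normalized image units; mirror starts centred.
pan = tilt = 0.0
target = (1.0, -0.5)
for _ in range(20):          # 20 ticks, i.e. ~20 ms at a 1 kHz vision rate
    err_x, err_y = target[0] - pan, target[1] - tilt
    pan, tilt = mirror_step(pan, tilt, err_x, err_y)
```

With the gain of 0.5 the pointing error halves every millisecond, which is why a bouncing ball can stay pinned under the projection: the loop converges far faster than the ball moves.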
MIT paper from Andrea Censi and Davide Scaramuzza: The agility of a robotic system is ultimately limited by the speed of its processing pipeline. The use of a Dynamic Vision Sensor (DVS), a sensor that produces asynchronous events as luminance changes are perceived by its pixels, makes it possible to have a sensing pipeline with a theoretical latency of a few microseconds. However, several challenges must be overcome: a DVS does not provide grayscale values but only changes in luminance; and because the output is composed of a sequence of events, traditional frame-based visual odometry methods are not applicable. This paper presents the first visual odometry system based on a DVS plus a normal CMOS camera to provide the absolute brightness values. The two sources of data are automatically spatiotemporally calibrated from logs taken during normal operation. We design a visual odometry method that uses the DVS events to estimate the relative displacement since the previous CMOS frame by processing each event individually. Experiments show that the rotation can be estimated with surprising accuracy, while the translation can be estimated only very noisily, because it produces few events due to very small apparent motion ... ( full paper )
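The key move in the abstract, estimating displacement since the last CMOS frame by processing each event individually, can be caricatured in one dimension: edges found in the last frame predict where DVS events should fire, and each incoming event votes for the shift that best explains it. This is a deliberately tiny stand-in for the paper's method, with all data fabricated:

```python
# Brightness profile from the last CMOS frame (1-D toy scene).
frame = [0, 0, 1, 1, 0, 0, 1, 1, 0, 0]
# DVS events fire where brightness changes, i.e. at the edges.
edges = [i for i in range(len(frame) - 1) if frame[i + 1] != frame[i]]

# Simulate a small camera motion: events arrive at shifted edge positions.
true_shift = 2
events = [e + true_shift for e in edges]

# Each event votes for every candidate shift that would place it on an edge;
# the displacement estimate is the shift with the most votes.
votes = {s: sum((e - s) in edges for e in events) for s in range(-3, 4)}
estimated = max(votes, key=votes.get)
```

The real system does this in 2-D against image gradients and fuses thousands of such per-event updates, but the frame-predicts-events structure is the same.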
From Leap Motion's developer blog : V2 retains the speed and positional accuracy found in V1, but the software also now tracks the actual joints and bones inside each of the user’s fingers. This leads to some immediate benefits over V1:
- Finger and hand labels – every finger, hand, and joint now has anatomical labels like ‘pinky’, ‘left hand’, and ‘proximal phalanges’
- Occlusion robustness – fingers are tracked even when they’re not seen by the controller, as might happen if you turned your hands completely vertically or intertwined the fingers of your left and right hands
- Massively improved resistance to ambient infrared light – sunlight, powerful halogens, etc.
- Much more granular data for developers about the user’s hands and fingers – 27 dimensions per hand, in addition to special parameters like grab/pinch APIs
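To make the anatomical model concrete, here is a sketch of what the labeled data might look like to a consumer of the tracking output. The class and field names echo the terms in the post but are illustrative only, not the actual Leap Motion SDK classes:

```python
from dataclasses import dataclass, field

# Anatomical labels from the post: per-finger bones and named fingers.
BONES = ("metacarpal", "proximal", "intermediate", "distal")
FINGERS = ("thumb", "index", "middle", "ring", "pinky")

@dataclass
class Finger:
    name: str
    # One 3-D tip position per bone; tracked even when the finger is occluded.
    bone_tips: dict = field(default_factory=dict)

@dataclass
class Hand:
    side: str                    # "left" or "right"
    fingers: list
    grab_strength: float = 0.0   # 0 = open hand, 1 = fist (grab/pinch idea)
    pinch_strength: float = 0.0

left = Hand("left", [Finger(n) for n in FINGERS])
```

Per hand, five fingers times four bones times a 3-D position already accounts for most of the "27 dimensions per hand" of granularity the post mentions.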
Available now at Maker Shed for $499 : If you haven't checked out the amazing capabilities of the DARwIn-OP Deluxe Edition , you should! DARwIn-Mini is the younger, but no less amazing, sibling of this award-winning robot. DARwIn-Mini is a Dynamic Anthropomorphic Robot with Intelligence from Korea-based ROBOTIS, whose kits are famed for their transformability and stunning humanoid designs.
- Custom controller based on the 32-bit ARM Cortex M3
- 3D files of the robot’s parts at Thingiverse
From Parrot's official announcement today: Today, after 5 years of development, we are excited to introduce the Parrot Bebop Drone, an ultra-light drone with a full HD camera digitally stabilized on 3 axes.
- Chris Anderson's and DIYDrones' first look
- IEEE Spectrum coverage
- Press release (pdf)
From OpenTX: OpenTX is open source firmware for RC radio transmitters. The firmware is highly configurable and offers many more features than those found in traditional radios... ( cont'd )
From Pattenstudio : Thumbles is an interactive tabletop system based on a group of tiny robots that users can grasp and manipulate. Each robot can represent anything from a character in a video game to a molecule in a scientific visualization. The system combines the versatility of a graphical interface with the tactile advantages of physical controls.
From Gobot's homepage: Gobot is a framework and set of libraries in the Go programming language for robotics, physical computing, and the Internet of Things... ( cont'd )
From Unbounded Robotics: Beginning today, UBR-1 is available for purchase. To order your own state-of-the-art mobile manipulation platform please contact email@example.com. The majority of requests we have had to date have been for the UBR-1 pro model, so we have decided to focus on and ship only one model. The UBR-1 costs $50,000 and includes the newest Hokuyo UST-20LX scanning laser. UBR-1 now offers a higher maximum speed, a state-of-the-art laser scanner, more RAM, and a larger hard drive capacity. In addition to the UBR-1 we are happy to announce that we will also be offering a ROS Ready Computer Package to make setting up your robot even easier. This package includes a computer with Ubuntu and ROS pre-installed, and a wireless router pre-configured to connect your robot and computer. We are planning to start shipping robots to their destinations in late August... ( cont'd )
From SRI International: SRI is developing new technology to reliably control thousands of micro-robots for smart manufacturing of macro-scale products in compact, integrated systems... ( cont'd )
From Kåre Halvorsen's project on the Lynxmotion forums: A sphere-shaped hexapod that I plan to give the following features:
- Roll freely like a ball
- Have different sorts of locomotion for moving in any direction
- Variable inner-body dimensions
- Transform from a sphere shape into a hexapod and vice versa
- Walk like a hexapod
Project's summary on Robotee and original forum thread.
From Google Online Security Blog: Translating a street address to an exact location on a map is harder than it seems. To take on this challenge and make Google Maps even more useful, we’ve been working on a new system to help locate addresses even more accurately, using some of the technology from the Street View and reCAPTCHA teams. This technology finds and reads street numbers in Street View, and correlates those numbers with existing addresses to pinpoint their exact location on Google Maps. We show that this system is able to accurately detect and read difficult numbers in Street View with 90% accuracy. It turns out that this new algorithm can also be used to read CAPTCHA puzzles—we found that it can decipher the hardest distorted text puzzles from reCAPTCHA with over 99% accuracy... ( cont'd ) ( full technical paper )
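The "correlate those numbers with existing addresses" step can be pictured as a simple join: house numbers read from imagery, each with an observed position, are matched against the known addresses on a street, and matches get pinned to the map. A hedged sketch with invented data and an intentionally naive matching rule (the production system is far more involved):

```python
# Known house numbers on a street segment, awaiting precise positions.
known_addresses = {"12", "14", "16"}

# Detections from imagery: (number read by the recognizer, lat/lon observed).
detections = [
    ("14", (37.4219, -122.0841)),
    ("12", (37.4221, -122.0839)),
    ("99", (37.4000, -122.1000)),  # misread or off-street: no known address
]

# Pin each detection that matches a known address to its observed position;
# unmatched detections are dropped rather than guessed at.
located = {num: pos for num, pos in detections if num in known_addresses}
```

Dropping unmatched detections is the conservative choice here: a confidently wrong pin on the map is worse than leaving an address at its interpolated position.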
Our fully autonomous intelligent vehicles will help you transform the way you move materials and route your workflows. Increase throughput, eliminate material flow errors, improve traceability, maximize flexibility and allow your employees to focus on higher-level tasks. Unlike traditional AGVs, our mobile robots navigate using the natural features of your facility and do not require expensive facility modifications or guidance infrastructure. Our AIVs can adapt to changes in their environment and work freely and safely with your staff. Our mobile robots are intelligent enough to quickly learn their environment and then automatically find the optimal path to where they need to go. They also automatically adjust to dynamic environments and can work together in fleets of up to 100 robots.
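"Learn the environment, then find the optimal path" is, at its simplest, shortest-path search over a learned map. A minimal stand-in for such a planner is breadth-first search on an occupancy grid, which returns a step-optimal route around obstacles; the grid and API here are illustrative, not the vendor's software:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS over a grid of 0 (free) / 1 (occupied); returns a list of cells
    from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # back-pointers for path reconstruction
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None

# A wall across the middle row forces a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = shortest_path(grid, (0, 0), (2, 0))
```

Replanning against a live map as people and pallets move is what separates an AIV from a wire-following AGV, but the inner search looks much like this.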