Earlier today IBM announced an experimental computer chip in which the computational elements and RAM are wired together far more closely than in standard CPUs available today. IBM has built two prototypes of the new chip, which it calls a “neurosynaptic core.” Both are built on a standard semiconductor platform with 256 “neurons,” the chip’s computational components. RAM units on the chip act as synapses; one of the chips has 262,144 synapses, while the other has 65,536. Nature magazine has a rundown of what is new about these chips and what they propose to achieve here . To understand what makes this approach different, you may want to read more about the current CPU architecture model, the von Neumann, or stored-program, architecture ( wikipedia ), which has an inherent bottleneck ( wikipedia ). Also, here is IBM's official research blog post about the announcement; IBM plans to release further details at the IEEE Custom Integrated Circuits Conference on September 20 in San Jose, California.
Travis Deyle at Hizook has a good rundown of the Swarmanoid project, a cooperative research effort funded by the European Commission to design and build a distributed robotic system. The swarmanoid will be made up of numerous (about 60) autonomous robots of three types: eye-bots, hand-bots, and foot-bots.
George C. Devol, the inventor of the first robot arm, the Unimate, died on Thursday at his home in Wilton, Conn. He was 99. In May of this year, Mr. Devol was inducted into the National Inventors Hall of Fame. The citation states, in part, “George Devol’s patent for the first digitally operated programmable robotic arm represents the foundation of the modern robotics industry.” Here is his NY Times obituary and a reprint of a Robot Magazine article titled The Rise And Fall Of Unimation , which profiles the history of Unimation, the original company Devol and partner Joseph F. Engelberger formed to produce the Unimate.
Pipetel's Explorer is an untethered, modular, remotely controllable, self-powered inspection robot for the visual and non-destructive inspection of 6" and 8" un-piggable natural gas transmission pipelines. The most common reasons a pipeline is un-piggable are flow rates too low to propel an in-line inspection tool (pig); the presence of obstacles such as valves, mitered bends, and back-to-back in-plane and out-of-plane bends; and the cost and operational complications of installing launching and receiving equipment. Explorer can also be used in distribution pipelines as a pre-inspection technology for other rehabilitation and repair techniques. The Explorer platform uses a Remote Field Eddy Current (RFEC) sensor, a non-destructive inspection sensor that uses low-frequency alternating current to measure wall thickness around the entire pipe circumference. Explorer also incorporates fisheye cameras at each end of the robot that provide high-quality visual inspection for locating joints, tees, and other pipeline appurtenances. As an in-line inspection tool, Explorer is launched, operated, and retrieved under live conditions and can negotiate diameter changes, bends and tees up to 90°, and inclined and vertical sections of the pipeline network.
Modkit is an in-browser graphical programming environment for microcontrollers. It lets you program Arduino and compatible hardware using simple graphical blocks and/or traditional text code. You start by configuring your hardware graphically, then snap together graphical code blocks to build programs for that hardware configuration, all inside your browser. Finally, using the downloadable widget, you send the finished code to your physical device. The Modkit MotoProto Shield for Arduino makes it easy to connect up to four sensors and control two DC motors as well as a 16x2 character LCD. The sensor jacks accept 2.5mm cables and provide access to VCC, GND, and an analog input.
Camerobot Systems makes a robot system for the automated movement of film and studio cameras in live broadcasting and/or VR sets. The robot has seven axes, a range of 4.0 meters in diameter, and a positional accuracy of ±0.05 mm. It is also capable of object and person tracking, collision avoidance, and movement syncing with virtual environments.
The newest episode of the robotics podcast Flexible Elements is up now. Host Per Sjöborg interviews Juan Gómez of Robotics Lab about the modular snake robots he is developing. In the Robotics Lab, his modular snakes have acquired new gaits (styles of moving) that include rotating, rolling, turning, moving forward, and moving backward. Everything is fully open source, with full plans available for 3D printing. The audio is available here , with a full synopsis and links from the discussion on Sjöborg's website here.
Jaybridge Robotics, in cooperation with agricultural equipment manufacturer Kinze, has unveiled the first autonomous grain cart system. The driverless system is fully controlled by advanced software and can perform a complete workflow during the harvest: locating a moving harvester in the field, synchronizing with it, collecting its grain, and delivering that grain to trucks near the field for transport.
TurtleBot.eu , the official European store for Willow Garage's open source, Kinect-enabled TurtleBot, has been working hard to make the original US design compatible with EU standards. The conversion meant swapping the original iRobot Create base for standard consumer Roombas, as well as creating a new power board and adapting the trays. The design plans for the EU-compatible power board and adapter are now available here . The original Willow Garage design can be found here .
The Robotic Highway Safety Markers system was developed by Shane Farritor, a professor at the University of Nebraska-Lincoln. The Robotic Safety Barrel (RSB) replaces the heavy base of a typical safety barrel with a mobile robot. The mobile robot can transport the safety barrel, and the robots can work in teams to provide traffic control. Independent, autonomous barrel motion has several advantages. First, the barrels can self-deploy, eliminating the dangerous task of manually placing barrels in busy traffic. To save costs, the robots work in teams: a more expensive "shepherd" robot with built-in Global Positioning System (GPS) navigation positions itself precisely, then guides the placement of less expensive units, which measure out their positions based on wheel movement (a "dead reckoning" system). In tests, the robots were able to deploy themselves about as well as humans could place them; their big wheels let them turn on a dime.
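Wheel-based dead reckoning of the kind described above can be sketched with the standard differential-drive odometry update; the names and parameters here are illustrative, not taken from the actual RSB system.

```cpp
#include <cmath>

// Hypothetical sketch of differential-drive dead reckoning: each update
// integrates how far the left and right wheels traveled into a new pose.
struct Pose { double x, y, theta; };

// Update pose from left/right wheel travel (meters) over one interval,
// for a robot whose wheels are `track` meters apart.
Pose deadReckonStep(Pose p, double dLeft, double dRight, double track) {
    double d = 0.5 * (dLeft + dRight);        // distance moved by the center
    double dTheta = (dRight - dLeft) / track; // change in heading (radians)
    // Advance along the average heading over the interval.
    p.x += d * std::cos(p.theta + 0.5 * dTheta);
    p.y += d * std::sin(p.theta + 0.5 * dTheta);
    p.theta += dTheta;
    return p;
}
```

Repeated over many intervals the pose estimate drifts as wheel-measurement errors accumulate, which is why a precisely positioned GPS "shepherd" is useful for anchoring the cheaper units.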
Per Sjöborg has a series of audio interviews with leading researchers and thinkers in the field of self-reconfiguring modular robotics. On his website, Flexibility Envelope, he describes the field as the joining of two elements. The first is modular robotics, a branch of robotics that aims to build complex systems from simple components: a bit like Lego, simple pieces are, by cooperating, capable of building complex objects. The second is self-reorganization, which makes the units able to move among each other of their own accord and thus reconfigure themselves from one task to another without human intervention. This also allows the resulting system to be active and dynamic. His audio interviews can be found here and are a great listen.
Birgus Latro has posted a write-up and several videos looking at the Cubelets KT01 Construction Kit from Modular Robotics . This is the first production run of the Cubelets and was limited to just 100 kits. Cubelets are a modular robotics kit consisting of 20 magnetic blocks that can be snapped together to make an endless variety of robots with no programming and no wires. Each cubelet in the kit has different equipment on board and a different default behavior. There are Sense Blocks that act like our eyes and ears; they can sense light, temperature, and how far away they are from other objects.
IEEE Spectrum has an article by Dr. Massimiliano Versace about a memristor-based approach to AI that consists of a chip that mimics how neurons process information. Researchers have suspected for decades that real artificial intelligence can't be done on traditional hardware, with its rigid adherence to Boolean logic and vast separation between memory and processing. But that knowledge was of little use until about two years ago, when HP built a new class of electronic device called the memristor. Before the memristor, it would have been impossible to create something with the form factor of a brain, the low power requirements, and the instantaneous internal communications. It turns out that those three things are key to making anything that resembles the brain and thus can be trained and coaxed to behave like a brain. In this case, form is function, or more accurately, function is hopeless without form. Basically, memristors are small enough, cheap enough, and efficient enough to fill the bill. Perhaps most important, they have key characteristics that resemble those of synapses. That's why they will be a crucial enabler of an artificial intelligence worthy of the term.
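The synapse-like behavior comes from the memristor's resistance depending on the charge that has flowed through it. A minimal sketch of the linear ion-drift model HP published (Strukov et al., 2008) treats the device as two resistors in series, with the boundary between doped and undoped regions at position w in [0, D]; the specific values below are illustrative, not measurements from a real device.

```cpp
// Simplified linear ion-drift memristor model: the device's resistance
// (memristance) interpolates between a low "on" value and a high "off"
// value as the doped region of width w grows across the device length D.
double memristance(double w, double D, double rOn, double rOff) {
    double frac = w / D;                      // fraction of device that is doped
    return rOn * frac + rOff * (1.0 - frac);  // two resistors in series
}
```

Because w shifts with the charge driven through the device and persists when power is removed, the memristance acts like a stored, analog synaptic weight.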
Kinect RGBDemo and the Nestk library by Nicolas Burrus aim to provide a simple toolkit for getting started with Kinect data and developing standalone computer vision programs without the hassle of integrating existing libraries. The 0.6 release includes two new demos: an interactive program to calibrate multiple RGBD cameras, and one-shot 3D model acquisition of objects lying on a table based on the PCL tabletop detector. Current features include: grabbing Kinect images and visualizing/replaying them; support for libfreenect and OpenNI/NITE backends; extraction of skeleton data and hand point positions (NITE backend); integration with OpenCV and PCL; multiple Kinect support and calibration; camera calibration to get point clouds in metric space (libfreenect); export to MeshLab/Blender using .ply files; demos of 3D scene reconstruction using a freehand Kinect, people detection and localization, gesture recognition and skeleton tracking using NITE, 3D model estimation of objects lying on a table (based on the PCL tabletop object detector), and multiple Kinect calibration; and Linux, Mac OS X, and Windows support.
The robotic inspector looks like nothing more than a small metallic cannonball. There are no propellers or rudders, nor any obvious mechanism on its surface to power the robot through an underwater environment. A robot outfitted with external thrusters or propellers could easily lodge in a reactor’s intricate structures, including sensor probes, networks of pipes, and joints. As the robot navigates a pipe system, its onboard camera takes images of the pipe’s interior. The original plan was to retrieve the robot and examine the images afterward, but now the MIT project director and his students are working to equip the robot with wireless underwater communications, using laser optics to transmit images in real time across distances of up to 100 meters.