From University of Tokyo: ACHIRES combines high-speed vision and high-speed actuators to achieve instantaneous recognition and behavior. Similar technologies are used in our Janken (Rock Paper Scissors) Robot. High-speed vision detects the state of the biped robot, including the timing of landing, at 600 fps. The biped mechanism, with a leg length of 14 cm, is set up to run in the sagittal plane. At present, the running velocity reaches 4.2 km/h. Simple control based on the high-speed performance of the sensory-motor system enables the biped robot to run stably without falling, unlike the computationally expensive ZMP-based control commonly used for balance. The aerial posture is adjusted to compensate for deviation from the stable trajectory using high-speed visual feedback. We also address the task of somersaulting. While running, the robot takes a big swing with one foot and jumps. After takeoff, both legs are controlled to curl up for high-speed rotation in the air. ACHIRES will continue to be improved to push the envelope while demonstrating various biped locomotion tasks... ( cont'd )
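The excerpt doesn't publish the controller itself, so as a rough illustration only, here is a minimal sketch of the core idea — a correction applied once per 600 fps vision frame. The proportional gain `KP` and the one-dimensional pitch-error model are assumptions for this sketch, not ACHIRES's actual control law.

```python
# Illustrative sketch only -- not the actual ACHIRES controller. A
# proportional correction is applied once per vision frame (600 fps),
# driving a simulated pitch deviation back toward the stable trajectory.

DT = 1.0 / 600.0   # one camera frame, about 1.67 ms
KP = 40.0          # hypothetical proportional gain (assumption)

def run_feedback(pitch_error_deg, frames=600):
    """Shrink a pitch deviation with one P-correction per frame."""
    for _ in range(frames):
        pitch_error_deg += -KP * pitch_error_deg * DT
    return pitch_error_deg

# One second of frames (600 steps) all but eliminates a 5-degree deviation.
residual = run_feedback(5.0)
```

The point of the sketch is the rate: at 600 Hz, even a small per-frame correction compounds into fast convergence, which is why a simple controller can suffice.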
From IEEE Spectrum: Printable, self-folding robot created by Harvard and MIT researchers... ( IEEE Spectrum story ) ( full paper )
From hitchBOT's page: I am hitchBOT — a robot from Port Credit, Ontario. I am traveling from Halifax, Nova Scotia to Victoria, British Columbia this summer. As you may have guessed, robots cannot get driver’s licences yet, so I’ll be hitchhiking my entire way... ( cont'd )
Ino tools webpage: Ino is a command line toolkit for working with Arduino hardware. It allows you to:
- Quickly create new projects
- Build a firmware from multiple source files and libraries
- Upload the firmware to a device
- Perform serial communication with a device (aka serial monitor)
Ino may replace the Arduino IDE UI if you prefer to work with the command line and an editor of your choice, or if you want to integrate the Arduino build process into a third-party IDE. Ino uses make to perform builds. However, the Makefiles are generated automatically, and you'll never see them if you don't want to.
Features:
- Simple. No build scripts are necessary.
- Out-of-source builds. Directories with source files are not cluttered with intermediate object files.
- Support for *.ino and *.pde sketches as well as raw *.c and *.cpp.
- Support for Arduino Software versions 1.x as well as 0.x.
- Automatic dependency tracking. Referenced libraries are automatically included in the build process. Changes in *.h files lead to recompilation of sources which include them.
- Pretty colorful output.
- Support for all boards that are supported by the Arduino IDE.
- Fast. Discovered tool paths and other data are cached across runs. If nothing has changed, nothing is built.
- Flexible. Support for simple ini-style config files to set up machine-specific info, such as the Arduino model used and the Arduino distribution path, just once.
( Homepage )
From Tilman Griesel's posts on DIY Drones: An open source project to make FPV (first person view) with the Raspberry Pi easy to use for everyone. OpenFPV is a project that uses the latest technology to provide low-latency, easy, and well-tested open source FPV flying, based on single-board computers, HD cameras, and IEEE 802.11.
Current features:
- Recording
- Web interface
- Low-latency H264 streaming (≈120 ms)
- RESTful API
- Customizable
- Extendable
- Installer (in progress)
- Minimal battery consumption
Roadmap:
- Invite more people to join the development team
- Complete the installer
- More field tests with different setups
- Create desktop applications for Mac/Windows with HUD support
- Add OculusVR support
A release is not available yet, but you can follow the progress on DIY Drones or the OpenFPV homepage.
Parallella Computer Specifications: The Parallella platform is an open source, energy efficient, high performance, credit-card sized computer based on the Epiphany multicore chips developed by Adapteva. This affordable platform is designed for developing and implementing high performance, parallel processing applications that take advantage of the on-board Epiphany chip. The Epiphany 16- and 64-core chips consist of a scalable array of simple RISC processors programmable in C/C++, connected by a fast on-chip network within a single shared memory architecture... ( cont'd ) A realtime raytracing example running on the 16-core Epiphany chip:
From Eben Upton, Raspberry Pi Founder: This isn't a "Raspberry Pi 2", but rather the final evolution of the original Raspberry Pi. Today, I'm very pleased to announce the immediate availability, at the same $35 price, of what we're calling the Raspberry Pi Model B+. The Model B+ uses the same BCM2835 application processor as the Model B. It runs the same software, and still has 512MB of RAM; but James and the team have made the following key improvements:
- More GPIO. The GPIO header has grown to 40 pins, while retaining the same pinout for the first 26 pins as the Model B.
- More USB. We now have 4 USB 2.0 ports, compared to 2 on the Model B, and better hotplug and overcurrent behaviour.
- Micro SD. The old friction-fit SD card socket has been replaced with a much nicer push-push micro SD version.
- Lower power consumption. By replacing linear regulators with switching ones, we've reduced power consumption by between 0.5W and 1W.
- Better audio. The audio circuit incorporates a dedicated low-noise power supply.
- Neater form factor. We've aligned the USB connectors with the board edge, moved composite video onto the 3.5mm jack, and added four squarely-placed mounting holes... ( cont'd )
From Jie Qi's projects page: Shape memory alloys (SMAs) are metals that change shape when heated. They are wonderful actuators in that they are light, silent, and can be "turned on" simply by running current through them. The shape they change to can also be set, though this process is a bit trickier. Flexinol is a particular brand of nitinol, an SMA made of nickel and titanium, and is pre-set to contract by about 10% of its original length when heated. In my projects, I generally used the 0.006" to 0.01" diameter, High-Temp wires. Since Flexinol draws a lot of current (about 300 mA for the diameters I used), you need a strong power supply such as a wall supply or a good lithium-ion battery. I've used supplies from 3.7 V up to 6 V (any more and my Flexinol wires would start overheating). To turn the Flexinol on, I would simply short the ends of the wire to the power supply. For digital control, I used a standard MOSFET circuit, a digital switch that can be turned on and off by a microcontroller... ( cont'd )
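As a back-of-envelope companion to the numbers above (≈300 mA at 3.7–6 V), here is a simple Ohm's-law sketch for sizing a series resistor to limit the current through a Flexinol segment. The 10-ohm segment resistance in the example is a placeholder, not a datasheet value; measure or look up the resistance of your actual wire.

```python
# Back-of-envelope sizing sketch for driving Flexinol from a fixed supply.
# The wire resistance used below is a placeholder (assumption), not a
# datasheet value -- check the vendor's specs for your wire diameter.

def drive_current(v_supply, wire_res_ohm, series_res_ohm=0.0):
    """Ohm's-law estimate of current through the Flexinol segment (amps)."""
    return v_supply / (wire_res_ohm + series_res_ohm)

def series_resistor_for(v_supply, wire_res_ohm, i_target):
    """Series resistance needed to limit the segment to i_target amps."""
    return max(0.0, v_supply / i_target - wire_res_ohm)

# Example: 5 V supply, assumed 10-ohm segment, 300 mA target current.
extra = series_resistor_for(5.0, 10.0, 0.300)   # about 6.7 ohms extra
```

In practice a MOSFET and PWM duty cycle, as described above, is the usual way to throttle average current; the resistor calculation is just a sanity check against overheating.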
Project Overview: BugJuggler will use a diesel engine to generate hydraulic pressure. An operator located in the robot's head will be able to control its motions using a haptic feedback interface connected to high-speed servo valves. Hydraulic accumulators - essentially storage batteries for hydraulic fluid - will allow for the rapid movement required for the robot to juggle cars or other large, heavy objects. The first stage of the BugJuggler project will be construction of a working 8 ft tall single-arm proof-of-principle juggler able to toss and catch a 250 lb mass... ( cont'd )
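For a feel of the scale involved, a quick physics sketch estimates the launch speed and kinetic energy needed to toss that 250 lb mass. The 2 m toss apex used in the example is an assumed figure for illustration, not a number from the project.

```python
import math

# Rough physics sketch for the single-arm juggler: launch speed and kinetic
# energy needed to toss a 250 lb mass to a chosen apex height. The 2 m apex
# is an assumed figure for illustration, not a number from the project.

G = 9.81               # m/s^2
LB_TO_KG = 0.45359237

def toss(mass_lb, apex_m):
    """Return (launch speed m/s, kinetic energy J) for a vertical toss."""
    m_kg = mass_lb * LB_TO_KG
    v = math.sqrt(2.0 * G * apex_m)   # speed at release for that apex
    return v, 0.5 * m_kg * v * v      # equals m*g*h at release

v, e = toss(250, 2.0)   # roughly 6.3 m/s and 2.2 kJ per throw
```

A couple of kilojoules delivered in a fraction of a second is exactly the burst-power profile hydraulic accumulators are good at, which motivates their role in the design.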
From Wired: Intel describes Jimmy as a research robot, but a less sophisticated version of the adorable droid will go on sale later this year for $1,600. The caveat is that you will have to 3D print your Jimmy. The 3D printing blueprints will be available without charge, but to construct the robot you will also need to purchase a kit from Intel that will contain all the parts of Jimmy that aren't printable, including motors and an Intel Edison processor... ( cont'd )
From InMoov's homepage: Gael Langevin is a French modelmaker and sculptor. He has worked for the biggest brands for more than 25 years. InMoov is his personal project, initiated in January 2012. InMoov is the first open source 3D printed life-size robot. Replicable on any home 3D printer with a 12x12x12 cm build area, it is conceived as a development platform for universities, laboratories, and hobbyists, but first of all for makers. Its concept, based on sharing and community, has given it the honor of being reproduced in countless projects throughout the world... ( cont'd )
From Doc-Ok.org: Video from a capture space consisting of one Oculus Rift head-mounted display and three Kinect 3D cameras set up in an equilateral triangle, with each Kinect approximately 2m from the center point. The resulting 3D video data is merged with a virtual 3D model of an office environment... (cont'd)
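The triangle layout described above is easy to pin down numerically; a small sketch computes the three camera positions 2 m from the center point. The absolute angles are an arbitrary choice for illustration, since the excerpt only fixes the triangle's shape and radius.

```python
import math

# Geometry sketch for the capture space described above: three Kinects on
# an equilateral triangle, each 2 m from the center point. The absolute
# angles (90, 210, 330 degrees) are an arbitrary choice for illustration.

def kinect_positions(radius_m=2.0):
    """Return (x, y) floor positions for the three cameras."""
    angles = (math.radians(d) for d in (90, 210, 330))
    return [(radius_m * math.cos(a), radius_m * math.sin(a)) for a in angles]

pts = kinect_positions()
# Any two cameras end up radius * sqrt(3), about 3.46 m, apart.
```

With a roughly 3.5 m camera-to-camera baseline, each Kinect sees the subject from a third of the circle, which is what lets the merged point clouds cover the wearer from all sides.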
Hasso-Plattner-Institut : faBrickation is a new approach to rapid prototyping of functional objects, such as the body of a head-mounted display. The key idea is to save 3D printing time by automatically substituting sub-volumes with standard building blocks — in our case Lego bricks. When making the body for a head-mounted display, for example, getting the optical path right is paramount. Users thus mark the lens mounts as “high-resolution” to indicate that these should later be 3D printed. faBrickator then 3D prints these parts. It also generates instructions that show users how to create everything else from Lego bricks.
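The substitution idea can be sketched as a toy algorithm — an illustration only, not the authors' actual method: voxels the user marked high-resolution stay on the 3D-print list, and everything else is greedily packed into 1xN Lego-like bricks along each row.

```python
# Toy sketch of the substitution idea -- an illustration, not the authors'
# algorithm. Voxels marked high-resolution stay on the 3D-print list; all
# remaining voxels are greedily packed into 1xN Lego-like bricks per row.

def fabrickate(grid, high_res):
    """grid, high_res: sets of (x, y) voxels. Returns (printed, bricks),
    where each brick is an (x_start, y, length) run along the x axis."""
    printed = grid & high_res
    remaining = set(grid) - set(high_res)
    bricks = []
    for x, y in sorted(remaining):       # sorted() snapshots before mutation
        if (x, y) not in remaining:      # already consumed by an earlier run
            continue
        length = 0
        while (x + length, y) in remaining:
            remaining.discard((x + length, y))
            length += 1
        bricks.append((x, y, length))
    return printed, bricks
```

The trade-off this captures is the paper's core point: brick-filled volume is assembled in seconds, so print time scales with the marked high-resolution regions rather than the whole object.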
From Factory-in-a-Day's page: Small and medium-sized enterprises in Europe mostly refrain from using advanced robot technology. The EU project Factory-in-a-Day aims to change this by developing a robotic system that can be set up and made operational in 24 hours and is flexible, leasable and cheap. The project has a budget of 11 million euros for four years, 7.9 million of which will be funded by the European Union as part of the FP7 programme 'Factory of the Future'. The international consortium comprises 16 partners, and the coordinating university is Delft University of Technology (TU Delft). The project will start on 8 October 2013 with a formal kick-off meeting in Delft.
Within 24 hours: The Factory-in-a-Day project will provide a solution to this problem: a robot that can be set up and operational in 24 hours. SME companies can use the robot for a specific job, and their staff can learn how to work closely together with the robot and thus optimize their production. "With the technological and organizational innovations of the Factory-in-a-Day project, we hope to fundamentally change the ways in which robots are used in the manufacturing world", says project coordinator Martijn Wisse, Associate Professor at TU Delft.
How does it work? What will such an installation day look like? First of all, before the robot is actually taken to the SME premises, a system integrator analyzes which steps in the process can be taken over by the robot. In most cases the repetitive work is done by the robot, while the human worker carries out the more flexible, accurate tasks and deals with problem-solving. Customer-specific hardware components are 3D-printed and installed on the grippers of the robot. The robot is then brought to the factory and set up, and any auxiliary components such as cameras are also set up in the unaltered production facilities. The robot will be connected to the machinery software through a brand-independent software system.
After that, the robot is taught how to perform its set of tasks, for example how to grasp an object. To do this, the operator physically interacts with the robot. A set of predefined skills will be available, rather like apps for smartphones. Finally, the robot is operational and the human co-workers receive their training -- all in just 24 hours.
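The "skills like apps" idea might be sketched as a simple registry that pairs installable skill routines with parameters taught by demonstration. All names here — the registry, the "grasp" skill, its parameter — are illustrative assumptions, not part of the actual Factory-in-a-Day software.

```python
# Minimal sketch of the "skills like apps" idea above. The registry, the
# "grasp" skill, and its parameter are illustrative names (assumptions),
# not part of the actual Factory-in-a-Day software.

class SkillRegistry:
    """Holds predefined skills; taught parameters come from demonstration."""

    def __init__(self):
        self._skills = {}

    def install(self, name, action):
        """Add a predefined skill, like installing an app."""
        self._skills[name] = action

    def run(self, name, **taught_params):
        """Execute a skill with parameters taught by the operator."""
        return self._skills[name](**taught_params)

registry = SkillRegistry()
registry.install("grasp", lambda width_mm: f"closing gripper to {width_mm} mm")
result = registry.run("grasp", width_mm=42)   # width taught by demonstration
```

The design point is the separation: the skill library ships predefined, and only the task-specific parameters are supplied on installation day, which is what makes a 24-hour setup plausible.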
The consumer electronics show CES is this week, so we will probably see a couple of new 3D printers announced. MakerBot has been teasing a new version of their Thing-O-Matic, and today 3D@Home announced their Cube printer. The printer will cost $1,299 and accepts standard .STL files to print ABS plastic models. 3D@Home also plans to offer a print-on-demand service for larger models.
Mobile & Service Robots - Featured Product
The new complete inertial navigation solution comprises the 3DM-GQ7 GNSS/INS sensor, the 3DM-RTK correction modem, and the real-time SensorCloud RTK correction network. The sensor features dual-antenna GNSS, a tactical-grade IMU, and centimeter-level accuracy with RTK. Low-profile and lightweight at 78 grams, the sensor is optimized for size and weight in unmanned ground vehicles, mobile robots, and autonomous vehicles. The 3DM-RTK network interface modem is seamlessly integrated into the 3DM-GQ7 sensor and supports the industry-standard NMEA and RTCM 3.1 protocols. Network RTK support comes with a cellular data plan. SensorCloud RTK is a cloud-based RTK correction system with a private encrypted data stream. Check out our G Series and C Series OEM products.
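The NMEA sentences mentioned above carry a simple integrity check: the XOR of every character between '$' and '*', written as two hex digits. Here is a small generic NMEA 0183 sketch — not code specific to the 3DM-GQ7:

```python
from functools import reduce

# Generic NMEA 0183 checksum sketch, not 3DM-GQ7-specific code. Each
# sentence's checksum is the XOR of every character between '$' and '*',
# written as two uppercase hex digits after the '*'.

def nmea_checksum(sentence):
    """Two-hex-digit XOR checksum of the sentence body."""
    body = sentence.strip().lstrip("$").split("*")[0]
    return "%02X" % reduce(lambda acc, ch: acc ^ ord(ch), body, 0)

def nmea_valid(sentence):
    """True if the text after '*' matches the computed checksum."""
    body, star, given = sentence.strip().partition("*")
    return bool(star) and nmea_checksum(body) == given.upper()
```

Validating the checksum before trusting a fix is cheap insurance on any serial or cellular link between the modem and the host.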