From Raspberry Pi Foundation:
The compute module contains the guts of a Raspberry Pi (the BCM2835 processor and 512MB of RAM) as well as a 4GB eMMC Flash device (which is the equivalent of the SD card in the Pi). This is all integrated onto a small 67.6 x 30mm board which fits into a standard DDR2 SODIMM connector (the same type of connector as used for laptop memory*). The Flash memory is connected directly to the processor on the board, but the remaining processor interfaces are available to the user via the connector pins. You get the full flexibility of the BCM2835 SoC (which means that many more GPIOs and interfaces are available as compared to the Raspberry Pi), and designing the module into a custom system should be relatively straightforward as we've put all the tricky bits onto the module itself.
So what you are seeing here is a Raspberry Pi shrunk down to fit on a SODIMM with onboard memory, whose connectors you can customise for your own needs.
The Compute Module is primarily designed for those who are going to create their own PCB. However, we are also launching something called the Compute Module IO Board to help designers get started.
MinnowBoard MAX is another open hardware embedded board we've developed to serve the needs of both the professional developer and hacker/maker community. Based on Intel's new Atom Bay Trail SoC platform, it offers a new generation of performance and features, but remains petite in size and cost.
Our entry-level SKU will be $99 MSRP, with additional board configuration options to be made available. All models will include 64-bit processors, USB 3.0, and Intel HD graphics with open source accelerated drivers for Linux, to name just a few of the new features!
- $99 MSRP: E3815 (single-core, 1.46 GHz), 1GB
- $129 MSRP: E3825 (dual-core, 1.33 GHz), 2GB
- HDMI (micro HDMI connector)
- 1 – Micro SD SDIO
- 1 – SATA2 3Gb/sec
- 1 – USB 3.0 (host)
- 1 – USB 2.0 (host)
- 1 – Serial debug via FTDI cable (sold separately)
- 10/100/1000 Ethernet
The low-speed expansion port is a 2×13 (26-pin) male 0.1″ pin header.
- SPI, I2C, I2S Audio, 2x UARTs (TTL-level), 8x GPIO (2x supporting PWM), +5V, GND
The high-speed expansion port is a 60-pin, high-density connector.
- 1x PCIe Gen 2.0 Lane, 1x SATA2 3Gb/sec, 1x USB 2.0 host, I2C, GPIO, JTAG, +5V, GND
From Evolving AI Lab:
Here we evolve the bodies of soft robots made of multiple materials (muscle, bone, & support tissue) to move quickly. Evolution produces a diverse array of fun, wacky, interesting, but ultimately functional soft robots. Enjoy! (full paper)
From John Goatstream's Vimeo Videos:
We present a muscle-based control method for simulated bipeds in which both the muscle routing and control parameters are optimized. This yields a generic locomotion control method that supports a variety of bipedal creatures. All actuation forces are the result of 3D simulated muscles, and a model of neural delay is included for all feedback paths. As a result, our controllers generate torque patterns that incorporate biomechanical constraints. The synthesized controllers find different gaits based on target speed, can cope with uneven terrain and external perturbations, and can steer to target directions... (full paper) (follow up videos)
DARPA tasks four companies with designing new aircraft to revolutionize vertical takeoff and landing (VTOL) flight capabilities.
For generations, new designs for vertical takeoff and landing aircraft have remained unable to increase top speed without sacrificing range, efficiency or the ability to do useful work. DARPA’s VTOL Experimental Plane (VTOL X-Plane) program seeks to overcome these challenges through innovative cross-pollination between the fixed-wing and rotary-wing worlds, to enable radical improvements in vertical and cruise flight capabilities. In an important step toward that goal, DARPA has awarded prime contracts for Phase 1 of VTOL X-Plane to four companies:
- Aurora Flight Sciences Corporation
- The Boeing Company
- Karem Aircraft, Inc.
- Sikorsky Aircraft Corporation
“We were looking for different approaches to solve this extremely challenging problem, and we got them,” said Ashish Bagai, DARPA program manager. “The proposals we’ve chosen aim to create new technologies and incorporate existing ones that VTOL designs so far have not succeeded in developing. We’re eager to see if the performers can integrate their ideas into designs that could potentially achieve the performance goals we’ve set.”
VTOL X-Plane seeks to develop a technology demonstrator that could:
- Achieve a top sustained flight speed of 300-400 kt
- Raise aircraft hover efficiency from 60 percent to at least 75 percent
- Present a more favorable cruise lift-to-drag ratio of at least 10, up from 5-6
- Carry a useful load of at least 40 percent of the vehicle’s projected gross weight of 10,000-12,000 pounds
All four winning companies proposed designs for unmanned vehicles, but the technologies that VTOL X-Plane intends to develop could apply equally well to manned aircraft. Another common element among the designs is that they all incorporate multipurpose technologies to varying degrees. Multipurpose technologies decrease the number of systems in a vehicle and its overall mechanical complexity. Multipurpose technologies also use space and weight more efficiently to improve performance and enable new and improved capabilities.
The next major milestone for VTOL X-Plane is scheduled for late 2015, when the four performers are required to submit preliminary designs. At that point, DARPA plans to review the designs to decide which to build as a technology demonstrator, with the goal of performing flight tests in the 2017-18 timeframe.
Soft robots — which don't just have soft exteriors but are also powered by fluid flowing through flexible channels — have become a sufficiently popular research topic that they now have their own journal, Soft Robotics. In the first issue of that journal, out this month, MIT researchers report the first self-contained autonomous soft robot, a "fish" that can execute an escape maneuver, convulsing its body to change direction, in just 100 milliseconds, or as quickly as a real fish can.
SimpleCV library for Python:
WHAT IS IT?
SimpleCV is an open source framework for building computer vision applications. With it, you get access to several high-powered computer vision libraries such as OpenCV – without having to first learn about bit depths, file formats, color spaces, buffer management, eigenvalues, or matrix versus bitmap storage. This is computer vision made easy... (cont'd)
The Agile Eye by Gosselin, Université Laval:
The Agile Eye is a 3-DOF 3-RRR spherical parallel manipulator developed for the rapid orientation of a camera. Its mechanical architecture leads to high velocities and accelerations.
The workspace of the Agile Eye is superior to that of the human eye. The miniature camera attached to the end-effector can be pointed within a cone of vision of 140° with ±30° in torsion. Moreover, due to its low inertia and inherent stiffness, the mechanism can achieve angular velocities above 1,000°/sec and angular accelerations greater than 20,000°/sec², which is beyond the capabilities of the human eye... (cont'd)
Hip Joint of the Bipedal Autonomous Robot LISA by Institute of Automatic Control:
The hip joint consists of three active rotational degrees of freedom whose rotational axes intersect at a single point. In contrast to the hip joints of most other bipedal robots, LISA's hip joint is built as a spherical parallel manipulator. A comparable cardanic joint would be heavier, since by design the masses of some motors would have to be accelerated by other motors during motion.
Because of the parallel structure, all motors remain fixed to the trunk, and only a coordinated interaction of all motors produces a controlled motion of the thigh. This enables a design with a thigh of minimal and a trunk of maximal weight, which is an advantageous weight distribution for bipedal walking. The parallel structure also distributes forces applied to the thigh among all three motors, so the power of the motors adds up... (cont'd)
From Robot Launch:
Robot Launch 2014 is open to any robot startup pre/partial Series A. We're looking for startups with prototypes and business models. But we're also interested in any great robot startup idea.
What is a robot startup? Well, it could be a robot or an autonomous mobile manipulator. OR it could be an appliance or connected device. OR it could be a sensor or actuator or AI that makes robots better.
Prizes include money, mentoring, meetings and free legal and startup services from our supporting organizations, Silicon Valley Robotics, Indiegogo, WilmerHale, Grishin Robotics, Bosch Venture Capital, Lemnos Labs, Luxr, Robolution Capital, Lux Capital, OATV, Khosla Ventures, a showcase at Solid and media coverage by Robohub.
- Round One entries open Feb 20
- Round One entries close March 30 midnight (PST)
- Top 30 announced April 10
- Finalists announced April 30
- Final Showcase (tbc) May 20
You can enter here.
What is it?
Our current prototype is a 5” phone containing customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the environment. These sensors allow the phone to make over a quarter million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the space around you.
It runs Android and includes development APIs to provide position, orientation, and depth data to standard Android applications written in Java, C/C++, as well as the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development. So, these experimental devices are intended only for the adventurous and are not a final shipping product.
How do I get one?
We’re looking for professional developers with dreams of creating more than a touch-screen app. These devices were built with the unique ability to sense 3D motion and geometry. We want partners who will push the technology forward and build great user experiences on top of this platform.
Currently, we have 200 prototype dev kits. We have allocated some of these devices for projects in the areas of indoor navigation/mapping, single/multiplayer games that use physical space, and new algorithms for processing sensor data. We have also set aside units for applications we haven’t thought of yet. Tell us what you would build. Be creative. Be specific. Be bold.
We expect to distribute all of our available units by March 14th, 2014... cont'd
From KUKA's Youtube page:
On March 11th 2014, table tennis champion Timo Boll will challenge KUKA's Agilus robot to a ping-pong showdown.
Watch the final on March 11th 2014 at www.kuka-timoboll.com to find out the winner.
Timo Boll, German table tennis star, is the new brand ambassador for KUKA Robotics in China. The collaboration celebrates the inherent speed, precision, and flexibility of KUKA's industrial robots in tandem with Boll's electrifying and tactical prowess in competition.
To celebrate the new KUKA Robotics factory in Shanghai, the two giants will battle to the end on March 11th 2014. The 20,000 sq. meter space will produce the KR QUANTEC series robot as well as the KRC4 universal controller for the Asian market. As a market leader in China, KUKA aims to further develop automation in the country while providing a modern and employee-friendly working environment.
Eugénie von Tunzelmann:
Ever since reading Richard Dawkins' book 'The Blind Watchmaker' I'd wanted to try my hand at some evolutionary programming. The idea is to model natural selection inside the computer by generating procedural creatures and allowing them to vary and improve over time without user intervention.
The code to build and rig the robots was written in Python, as was the code to run the rigid body simulation, using the Open Dynamics Engine to drive the sim. I wrote an importer for Side Effects' Houdini to read in my robot simulations so I could render them out as pictures.
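The actual robot-building and simulation code isn't shown; as an illustration of the vary-and-select loop the project describes, here is a minimal generational algorithm in Python. This is a generic sketch, not von Tunzelmann's code: the fitness function is a stand-in for the rigid-body walking simulation, and all names and parameters are hypothetical.

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=100, seed=0):
    """Minimal selection/mutation loop: keep the fitter half of the
    population, then refill it with mutated copies of the survivors.
    The best individual is always carried over, so peak fitness
    never decreases."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [
            [g + rng.gauss(0, 0.1) for g in rng.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return max(pop, key=fitness)

# Toy stand-in for "walks well": genomes closer to all-zeros score higher.
best = evolve(lambda g: -sum(x * x for x in g))
```

In the real pipeline, the fitness call would launch an Open Dynamics Engine simulation of the rigged robot and score the distance it travels.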
From Studio diip:
“Fish on Wheels” has been developed so fish can steer their tank in a certain direction. Our pet fish have always been limited to their water-holding area known as “the fish tank”. In an attempt to liberate fish all over the world, the first self-driving car for fish has been developed. This car moves by detecting the fish's position with computer vision. Up until now, driving vehicles has been limited to mankind only (excluding a handful of autonomous vehicles driven by computers), but now your pet fish can also put the pedal to the metal.
A prototype version of “Fish on Wheels” has been constructed using a standard webcam, a battery-powered Beagleboard and an Arduino-controlled robot vehicle. Using the contrast of the fish against the bottom of the fish tank, its position is determined and used to send commands to the Arduino for moving the car in that direction.
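Studio diip hasn't released their vision code, but the contrast-based tracking step can be sketched in a few lines of Python with NumPy. This is a minimal illustration, assuming a dark fish over a light tank bottom; the threshold value and command names are made up for the example.

```python
import numpy as np

def fish_position(frame, threshold=80):
    """Locate a dark fish against a light tank bottom: threshold the
    grayscale frame, then return the centroid (row, col) of the dark
    pixels, or None if nothing stands out."""
    mask = frame < threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def drive_command(frame, deadband=5):
    """Map the fish's horizontal position to a crude steering command
    that would be sent to the Arduino."""
    pos = fish_position(frame)
    if pos is None:
        return "stop"
    _, col = pos
    center = frame.shape[1] / 2
    if col < center - deadband:
        return "left"
    if col > center + deadband:
        return "right"
    return "forward"

# Synthetic 60x80 "webcam frame": light background, dark blob on the right.
frame = np.full((60, 80), 200, dtype=np.uint8)
frame[20:30, 60:70] = 30
```

A real implementation would read frames from the webcam and likely smooth the position over time, but the core idea is just this threshold-and-centroid step.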
From the project's Kickstarter ($104,217 pledged of $5,000 goal):
uArm is a 4-axis parallel-mechanism robot arm, inspired by the ABB PalletPack industrial robot arm IRB460. ($185 for complete black kit and a gripper)
The basic design is Arduino-controlled with 4 degrees of freedom. Three servos on the base control the main movement of the arm and the mini servo on the top moves and rotates the object. The end-effector of the arm is always kept parallel to the ground.
Right now we have already developed a Windows application that allows the uArm to be controlled with keyboard or mouse.
With some basic control skills, you can use basically any input device to control it; for example, we have also used other remote controllers to control the arm. With our embedded inverse-kinematics algorithm, the uArm can be precisely controlled using coordinates.
We have also written an Arduino library specifically for controlling the uArm. So if you are familiar with Arduino, you can program it directly with Arduino IDE. By calling different functions, you can easily move uArm to your desired position without doing tons of hard math... cont'd
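The uArm's firmware isn't reproduced here, but the "coordinates in, servo angles out" idea behind inverse kinematics can be illustrated with a two-link planar solve. This is a simplification: the uArm's parallel linkage differs, and the link lengths below are hypothetical, not uArm measurements.

```python
import math

def ik_2link(x, y, l1=148.0, l2=160.0):
    """Solve shoulder and elbow angles (radians) for a two-link planar
    arm reaching the point (x, y). Link lengths are illustrative only."""
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if not abs(l1 - l2) <= d <= l1 + l2:
        raise ValueError("target out of reach")
    # Law of cosines gives the elbow angle between the two links.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: angle to the target, minus the offset caused by the
    # bent elbow.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Given a target coordinate, the two returned angles can be fed straight to the servos; the "tons of hard math" the campaign mentions is essentially this computation, hidden inside their Arduino library.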
Clearpath Robotics has posted parts one and two of their ongoing introduction to the Robot Operating System (ROS):
Part One: Intro
Since we practically live in the Robot Operating System (ROS), we thought it was time to share some tips on how to get started with ROS. We’ll answer questions like where do I begin? How do I get started? What terminology should I brush up on? Keep an eye out for this ongoing ROS 101 blog series that will provide you with a top to bottom view of ROS that will focus on introducing basic concepts simply, cleanly and at a reasonable pace... cont'd
Part Two: Setup And Example
In the previous ROS 101 post, we provided a quick introduction to ROS to answer questions like What is ROS? and How do I get started? Now that you understand the basics, here’s how they can apply to a practical example. Follow along to see how we actually ‘do’ all of these things…. cont'd