Richard Waters for FT.com: Toyota has hired the top robotics expert from the US defence department’s research arm and promised $50m in extra funding for artificial intelligence research, as it steps up the race between the world’s biggest carmakers to pioneer new forms of computer-assisted driving.
However, the Japanese carmaker maintained on Friday that completely driverless cars were still years away, and that AI and robotics would have a more complex effect on the relationship between humans and their vehicles than Google’s experiments with “robot cars” suggest.
Gill Pratt, who stepped down recently from the Defense Advanced Research Projects Agency (Darpa), will move to Silicon Valley to head Toyota’s robotics efforts, the company said. Darpa played a key role in stimulating interest in driverless cars with a competition in 2005 — the leader of the winning entry, Sebastian Thrun, who was then a professor at Stanford University, went on to found Google’s driverless car programme. Cont'd...
An animated retelling of a Buckminster Fuller story about ephemeralization (doing more with less):
By Dian Schaffhauser for Campus Technology: A doctoral program at Michigan State University has begun experimenting with the use of robots to pull on-campus and off-campus students closer together in class. The Educational Psychology and Educational Technology (EPET) doctoral program focuses on the study of human learning and development and diverse technologies supporting learning and teaching. During a spring course in 2015, all but one student participated by being present in the form of an Apple iPad affixed to a stationary swivel robot; one student was on a robot that could move around the classroom.
As Christine Greenhow, the faculty member who led the seminar course, explained, the experiment was intended to expand beyond traditional Web presence of online students. "When you are using videoconferencing, it's very common to see all these different faces on the screen if you're here in the classroom and not really know where to look. It creates this distance between the speaker who's online and the speakers in the class," she said in a video about the project. "What if we could put online students in the classroom in a robot? How would their presence change?" Cont'd...
Bot-maker Savioke announces an open-source wrapper for Intel's RealSense Camera, adding another low-cost 3D sensing solution to the roboticist's toolkit.
The wrapper will allow developers to make use of the RealSense Camera, which enables robots to sense rich three-dimensional environments. "Intel RealSense Cameras bring great low-cost depth sensing to robotics, in a platform that is widely available and easy to integrate using ROS," says Steve Cousins, CEO of Savioke.
Until recently, bot makers looking to incorporate 3D sensing on the cheap have relied on a sensor made by Israeli company PrimeSense. But in late 2013 PrimeSense was acquired by Apple for $350M, an indication of just how much potential the Cupertino-based giant sees in 3D sensing technology.
Since the acquisition, robot developers have been eager for a flexible and cheap depth sensor. Intel, meanwhile, is making an aggressive move into the world of robotics, and the company was thrilled to offer ROS support for RealSense.
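The article doesn't show the wrapper itself, but the core of what any depth camera such as the RealSense provides is a per-pixel distance image. As a rough, hypothetical sketch (the function name and intrinsic values below are invented for illustration, not part of Savioke's wrapper), back-projecting that image through the pinhole camera model yields the 3D point cloud a robot actually navigates with:

```python
def depth_to_points(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (rows of depths in meters) into 3D points
    using the pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    points = []
    for v, row in enumerate(depth_m):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # zero depth means no valid return at this pixel
            points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# A tiny 2x2 "depth image" of a flat surface 1 m in front of the camera:
cloud = depth_to_points([[1.0, 1.0], [1.0, 1.0]], fx=2.0, fy=2.0, cx=1.0, cy=1.0)
```

Every valid pixel becomes one 3D point; in a ROS setting this is the kind of data that would arrive as a point-cloud message for obstacle avoidance or mapping.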
From the Instapainting Blog:
Over the past three weeks I’ve been working on a robotic painter to research the area of mechanical artwork reproduction and automated picture to painting creation for Instapainting.com and the print store e-commerce platform A Manufactory.
The initial prototype was built in about 3 weeks, and currently does mechanical reproductions. The AI painting mode, which will paint a photograph, will follow in the next post (I'm putting some finishing touches on it)...
...The current prototype operates on 3 dimensions: X, Y, and a Z axis for pen pressure from the Wacom tablet. The artist can control the motion from a Wacom tablet and, for the most part, it’s lag-free. Every stroke is recorded so that it can be played back. You can see both the initial painting and the playback in the video below... (full post)
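The post doesn't include the recording code, but a stroke recorder of the kind described boils down to logging timestamped (x, y, pressure) samples and replaying them in order. The class and method names below are invented for illustration, not taken from the Instapainting source:

```python
import time

class StrokeRecorder:
    """Record pen samples as (t, x, y, pressure) tuples so that a stroke
    can later be replayed in the same order (and, in a real player,
    at the same pacing)."""
    def __init__(self):
        self.samples = []

    def record(self, x, y, pressure, t=None):
        # Default to a monotonic clock so playback timing can't go backwards.
        self.samples.append((time.monotonic() if t is None else t, x, y, pressure))

    def playback(self, move_pen):
        # A real player would sleep (t - t_prev) between samples to
        # reproduce the original pacing; here we just replay in order.
        for t, x, y, p in self.samples:
            move_pen(x, y, p)

recorder = StrokeRecorder()
recorder.record(0.0, 0.0, 0.2, t=0.0)
recorder.record(1.0, 0.5, 0.8, t=0.1)
replayed = []
recorder.playback(lambda x, y, p: replayed.append((x, y, p)))
```

The pressure channel is what maps the Wacom input onto the Z axis of the machine.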
By Dominic Basulto for the Washington Post: Researchers at MIT have just unveiled the ability to 3-D-print beautiful glass objects. While humanity has been forming, blowing and molding glass objects for more than 4,500 years, this is the first time that a 3-D printer has been used to process glass from a molten state to an annealed product.
Obviously, there are some purely aesthetic applications here, as in the potential for epic blown glass art. Think museum-worthy glass objects worthy of Dale Chihuly. In fact, the MIT team — a collaborative team of researchers that includes the MIT Media Lab’s Mediated Matter group, the MIT Glass Lab and MIT’s Mechanical Engineering Department — plan to display a few of their beautiful objects at an upcoming exhibition at the Cooper Hewitt, Smithsonian Design Museum in 2016.
But the applications go beyond just beautiful new designs that might be created via 3-D printers one day. As the MIT research team points out in a forthcoming paper for the journal 3-D Printing and Additive Manufacturing, “As designers learn to utilize this new freedom in glass manufacturing it is expected that a whole range of novel applications will be discovered.” That’s the real future potential of glass 3-D printing — the ability to create objects and applications that do not exist today. Cont'd...
Mike Elgan for Computer World: Consumer drone technology is barely taking off, and already a harsh public backlash is growing.
Your typical garden variety consumer drone is lightweight, battery operated, has four propellers and is controlled by a smartphone. Most have cameras and beam back live video, which can be recorded for posterity. Some have high-quality HD cameras on them, and from that high vantage point can take stunning photos and videos.
Drones are fun. They're exciting. They're accessible. But increasingly, they're becoming unacceptable.
I'm sensing a growing backlash, a kind of social media pitchfork mob against drones and drone fans. It's only a matter of time, and not much time, before it will be politically incorrect to express any kind of enthusiasm for drones in polite company. I fear that many are about to embark on an "everybody knows drones are bad" mentality that will suppress the nascent industry and spoil this innovative and exhilarating technology.
Here's what's driving the coming backlash: Cont'd...
You know how stuntmen make fast cars drift in action movies? Have you ever wanted to make a remote-controlled toy car drift like that? Of course you have. If there were ever awards for endeavors that sound silly but are actually technically interesting, the folks at MIT’s Aerospace Controls Lab would surely be nominated.
Unmanned systems are rarely fully autonomous. Instead, researchers are pursuing “sliding” autonomy, i.e., an operator retains control while some behaviors are made autonomous. The Aerospace Controls Lab decided to teach a remote-control toy car how to autonomously drift.
They started by running their learning algorithm through simulations. Information from these simulations was transferred to performance modifiers. When the car was run through its drifting actions in reality, the algorithm was constantly modified. The result is a car that can maintain drifting in a full circle even when salt is added to the floor, or another vehicle interferes with it.
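The lab's actual learning algorithm isn't described in detail here; as a toy illustration of the general trial-and-error idea (run the system, measure the error, nudge a control parameter against it on the next run), consider this invented sketch:

```python
def iterative_trials(run_trial, theta0, gain, n_trials):
    """Generic trial-and-error loop: after each run, nudge the control
    parameter against the tracking error observed on that trial."""
    theta = theta0
    for _ in range(n_trials):
        error = run_trial(theta)   # e.g. drift-radius error measured on the car
        theta -= gain * error      # simple proportional correction per trial
    return theta

# Toy "plant": the steering offset that holds the drift is really 0.8,
# and each trial reports how far off the current guess is.
theta = iterative_trials(lambda th: th - 0.8, theta0=0.0, gain=0.5, n_trials=20)
```

This is why a disturbance like salt on the floor is recoverable: the error it induces feeds straight back into the next correction.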
Boston Dynamics has developed "Atlas," a highly mobile humanoid robot designed to negotiate rough outdoor terrain. Here is a video of "Atlas" in action, courtesy of euronews.
MIT researchers have designed a human-machine interface that allows an exoskeleton-wearing human operator to control the movements and balance of a bipedal robot.
The technology could allow robots to be deployed to a disaster site, where the robot would explore the area, guided by a human operator from a remote location.
"We'd eventually have someone wearing a full-body suit and goggles, so he can feel and see everything the robot does, and vice versa," said PhD student Joao Ramos of Massachusetts Institute of Technology's Department of Mechanical Engineering.
"We plan to have the robot walk as a quadruped, then stand up on two feet to do difficult manipulation tasks such as open a door or clear an obstacle," Ramos said. Cont'd...
Y Combinator-backed Auro Robotics is currently testing its driverless shuttle system at several universities, and is beginning to deploy shuttles on the campus of Santa Clara University.
The company is also planning to expand to other markets like amusement parks, retirement communities and small islands, with some projects in those spaces already set to take off “in the later part of this year.”
Auro has chosen to focus on these small, contained environments largely because they are controlled by private corporations, and thus are not subject to the heavy government regulation that Google and other companies are stuck behind with their driverless cars... (full article)
From Auro Robotics:
How does it work?
Auro Prime uses the latest technology to ensure safe navigation even on busy roads. The vehicle is equipped with lasers, cameras, radar and GPS, giving it complete 360-degree vision in all environmental conditions.
The shuttle relies on a prior 3D map of the environment, which is created once at the outset. In all subsequent runs, it uses this 3D map to localise itself and to interpret the road topography.
Passengers can input their destinations through a simple touch screen mounted on the vehicle, or through a mobile app. The underlying software automatically figures out the optimum high-level route to reach the destination safely... (more info)
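Auro doesn't describe its routing software, but "figuring out the optimum high-level route" over a campus road network is classically a shortest-path search. Here is a generic Dijkstra sketch over an invented campus graph (the place names and edge weights are illustrative only, not Auro's data):

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over a weighted adjacency dict."""
    queue = [(0, start, [start])]  # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical campus map: edge weights could be travel times in minutes.
campus = {
    "Gate": {"Library": 2, "Quad": 5},
    "Library": {"Quad": 1, "EngHall": 4},
    "Quad": {"EngHall": 1},
    "EngHall": {},
}
cost, path = shortest_route(campus, "Gate", "EngHall")
```

The same graph could be derived from the prior 3D map, with the low-level controller handling the actual driving between waypoints.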
By Deborah M. Todd / Pittsburgh Post-Gazette: A new accelerator program and a $20 million venture fund started by Carnegie Mellon University and GE Ventures could brand Pittsburgh as the official home of the globe’s growing robotics industry.
CMU’s National Robotics Engineering Center and GE Ventures, the investment arm of Fairfield, Conn.-based General Electric, have teamed up to create The Robotics Hub, an early-stage startup accelerator program designed to draw the nation’s best advanced robotics firms to Pittsburgh and to keep those started here firmly in place.
The for-profit Robotics Hub will provide funding through newly created Coal Hill Ventures and access to equipment at CMU and the NREC to chosen companies by 2016, in addition to putting their creations on a fast track toward commercialization.
“The strategy that’s most important to GE is to really get behind startups and help them scale. A lot of companies can come with the money, but what we bring is the ability to scale and the opportunity to commercialize quite quickly,” said Alex Tepper, GE Ventures managing director. Cont'd...
GreyOrange, a robotics firm in the business of automating warehouses, has raised $30 million (Rs 191.6 crore) in a round led by Tiger Global Management, with participation from existing investor Blume Ventures.
The funding, which the company says is one of the largest for a robotics company globally, will be used to develop new products and to expand internationally into the Asia Pacific, the Middle East and Europe. The company says it has a 90% share of India's warehouse automation market and powers over 180,000 square feet of warehouse space.
"We are doubling our team size globally as we steer the company and our products beyond India and into international markets," said co-founder and CEO Samay Kohli, who founded the company with Akash Gupta in 2011.
The company has two products: The Sorter and the Butler. The former is a high-speed system that consolidates orders and routes parcels. By Diwali, the company will have installed sortation capacity of 3 million parcels per day.
The second product, the Butler, is an order-picking system that is tailored for high-volume, high-mix orders characteristic of e-commerce and omni-channel logistics fulfilment. Cont'd...
Engineers use the environment to give simple robotic grippers more dexterity.
Engineers at MIT have now hit upon a way to impart more dexterity to simple robotic grippers: using the environment as a helping hand. The team, led by Alberto Rodriguez, an assistant professor of mechanical engineering, and graduate student Nikhil Chavan-Dafle, has developed a model that predicts the force with which a robotic gripper needs to push against various fixtures in the environment in order to adjust its grasp on an object.
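The team's full model appears in their paper; the basic physical intuition, though, is Coulomb friction: pushing an object against a fixture only re-seats it within the grasp once the push exceeds the friction the fingertips can supply. A deliberately simplified sketch of that threshold (the function and the numbers are invented for illustration, not the MIT model):

```python
def min_push_force(grip_normal_force, friction_coeff, contacts=2):
    """Coulomb-friction sketch: to slide a grasped object against the
    fingertips, an external push must exceed the total friction the
    contacts can resist (mu * N per contact)."""
    return friction_coeff * grip_normal_force * contacts

# Two fingertips each squeezing at 10 N, with a friction coefficient of 0.4:
f = min_push_force(10.0, 0.4)  # -> 8.0 N
```

Pushing just above this threshold lets the environment do the regrasping; pushing far above it risks losing the object entirely, which is why predicting the force accurately matters.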
By Matt Beane for MIT Technology Review: I think perhaps there’s something else at work here. Beyond building robots to increase productivity and do dangerous, dehumanizing tasks, we have made the technology into a potent symbol of sweeping change in the labor market, increased inequality, and recently the displacement of workers. If we replace the word “robot” with “machine,” this has happened in cycles extending well back through the Industrial Revolution. Holders of capital invest in machinery to increase production because they get a better return, and then many people, including some journalists, academics, and workers, cry foul, pointing to the machinery as destroying jobs. Amid the uproar, there are eventually a few reports of people angrily breaking the machines.
Two years ago, I did an observational study of semiautonomous mobile delivery robots at three different hospitals. I went in looking for how using the robots changed the way work got done, but I found out that beyond increasing productivity through delivery work, the robots were kept around as a symbol of how progressive the hospitals were, and that when people who’d been doing similar delivery jobs at the hospitals quit, their positions weren’t filled. Cont'd...