Robotics expert: Self-driving cars not ready for deployment

Joan Lowy for PHYS.org:  Self-driving cars are "absolutely not" ready for widespread deployment despite a rush to put them on the road, a robotics expert warned Tuesday. The cars aren't yet able to handle bad weather, including standing water, drizzling rain, sudden downpours and snow, Missy Cummings, director of Duke University's robotics program, told the Senate Commerce Committee. And they certainly aren't equipped to follow the directions of a police officer, she said. While enthusiastic about research into self-driving cars, "I am decidedly less optimistic about what I perceive to be a rush to field systems that are absolutely not ready for widespread deployment, and certainly not ready for humans to be completely taken out of the driver's seat," Cummings said. It's relatively easy for hackers to take control of the GPS navigation systems of self-driving cars, Cummings said. "It is feasible that people could commandeer self-driving vehicles ... to do their bidding, which could be malicious or simply just for the thrill of it," she said, adding that privacy of personal data is another concern.   Cont'd...

I Sing the Body Electric

Prosthetic technology is advancing rapidly, but not without sticking points.

Image Processing 101

Sher Minn Chong wrote a good introduction to image processing in Python: In this article, I will go through some basic building blocks of image processing, and share some code and approaches to basic how-tos. All code written is in Python and uses  OpenCV , a powerful image processing and computer vision library... ... When we're trying to gather information about an image, we'll first need to break it up into the features we are interested in. This is called segmentation. Image segmentation is the process of partitioning an image into segments to make it more meaningful and easier to analyze. Thresholding: One of the simplest ways of segmenting an image is thresholding. The basic idea of thresholding is to replace each pixel in an image with a white pixel if a channel value of that pixel exceeds a certain threshold... ( full tutorial ) ( iPython Notebook )
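The thresholding idea described in the excerpt can be sketched in a few lines. This is a minimal, library-free illustration using NumPy on a synthetic grayscale image (the tutorial itself uses OpenCV, where the equivalent call is `cv2.threshold`); the image data and threshold value here are made up for demonstration.

```python
import numpy as np

# Synthetic 8-bit grayscale "image": dark background with a bright square.
img = np.zeros((100, 100), dtype=np.uint8)
img[30:70, 30:70] = 200

# Thresholding: replace each pixel with white (255) if its value
# exceeds the threshold, otherwise black (0).
threshold = 127
binary = np.where(img > threshold, 255, 0).astype(np.uint8)

print(binary[50, 50])  # 255: inside the bright square
print(binary[10, 10])  # 0:   background
```

The result is a binary mask that separates the bright segment from the background, which is exactly the kind of segmentation step the article builds on.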

18 months since the toolkit's release, soft robotics is flying

Gordon Hunt for SiliconRepublic:  Pioneered in Ireland by the likes of Dr Dónal Holland, with a plethora of departments in Harvard University in the US involved, the Soft Robotics Toolkit has gone on to foster significant interest in an area exploding into the mainstream. More than 76,000 people have engaged with the service since it was created, represented across 150 different countries, with the toolkit identified as having made one of the most significant contributions to the development of the nascent industry to date. While robotics engineering used to focus much more attention on creating the rigid, hard-bodied prototypes like Bender from Futurama, for example, lately there has been a push towards soft, malleable structures that take their inspiration from nature.   Cont'd...

Rethink Robotics Announces Major Distribution Partnerships in Germany

Sawyer is a smart, collaborative robot that can be trained by demonstration and change tasks quickly to fit the individual needs of the factory.

What The Heck Is An Unmanned Ground Engineering Vehicle?

The boom can be fitted with up to 80 different tools, including hydraulic hammers, cutting discs, clamps, and buckets.

Carnegie Mellon robotics selected for research projects totaling more than $11 million

Carnegie Mellon University's National Robotics Engineering Center (NREC) has been selected as a prime contractor or subcontractor on four major new federal research projects totaling more than $11 million over the next three years. The projects range from research on a wheel that can transform into a track to automated stress testing for critical software.  Herman Herman, NREC director, said the center has hired 10 new technical staff members in the past six months and anticipates hiring another five to 10 staff members in the coming months to augment its existing staff of about 100.  "For the past 20 years, NREC has been an important national resource, combining unique technical skills and testing capabilities to solve problems that other groups can't," said Martial Hebert, director of CMU's Robotics Institute, which includes the NREC. "These new projects are a reminder that NREC continues to advance the art and science of robotics and that it remains a vital part of Carnegie Mellon's Robotics Institute."    Full Press Release...

User Case Study: MapleSim Used To Speed Up Development Of High-fidelity Robotic Manipulator Models

Using MapleSim, engineers created multiple models of a robotic manipulator in the time previously required to create just one model.

Artificial Skin That Glows, Stretches Could Change Robotics?

By Brendan Byrne for ValueWalk:  Researchers at Cornell University have developed an electronic artificial skin that doesn't mind being stretched to 500% its original size, glows in the dark and can move a bit like a worm. In a paper published yesterday in the journal Science, a team of researchers showed off glowing electric skin that could be put to use in future wearables. While artificial skin that responds to commands has been done before, electronics embedded in the skin have generally broken when stretched. However, the team seems to have leaped over this hurdle by using hyperelastic light-emitting capacitor (HLEC) technology. "It's actually much, much, much more stretchable than human skin or octopus skin," says Chris Larson, a doctoral candidate and researcher in Cornell's Organic Robotics Lab. "In terms of texture, it's actually more like a rubber band or a balloon." While Larson freely admits that he doesn't know much about cephalopods, the team was inspired by biology, specifically, the octopus beak with its ability to both move and stretch. "The researchers created a three-chamber robot from the material, with the newly developed 'skin' layers on top, and inflatable layers below that allow movement," according to a release from the American Association for the Advancement of Science. "As the chambers expand linearly, the robot moves forward with a worm-like wiggle."   Cont'd...

Postdoc's Trump Twitterbot Uses AI To Train Itself On Transcripts From Trump Speeches

From MIT: This week a postdoc at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) developed a Trump Twitterbot that Tweets out remarkably Trump-like statements, such as “I’m what ISIS doesn’t need.” The bot is based on an artificial-intelligence algorithm that is trained on just a few hours of transcripts of Trump’s victory speeches and debate performances... ... ( MIT article ) ( twitter feed )

Using Drones for Aerial Photography

If you can envision a shot, the drone can probably help achieve the photograph or video.

How This New Drone Can Track Your Every Move

Lisa Eadicicco  for Time:  Drones can already follow professional snowboarders as they speed down a slope or keep up with mountain bikers racing through rocky terrain. But drone-equipped athletes are usually required to keep their phone nearby, since the aerial devices often rely on handheld devices’ GPS signal to track a person’s location. DJI’s newest drone, the Phantom 4, claims to eliminate that hassle. The company says the Phantom 4’s new ActiveTrack feature uses the drone’s front-facing sensors to see and track a target. “Being able to learn about the object, as it squats, as it rotates, as it turns, is really complicated,” says Michael Perry, DJI’s director of strategic partnerships. “When you’re flying toward something, you have to make a decision to fly around it, fly above it, or stop. And to train the system to learn those different functions is also a big challenge.”   Cont'd...

Boomers at Work: Retirement vs. Working … It's Complicated

A lot of really smart people in technology took Confucius' advice, "Choose a job you love and you will never have to work a day in your life."

Would you buy meat from a robot butcher?

Greg Nichols for The Kernel:  In an era when hunks of cow and pig are packaged and distributed like Amazon Prime parcels, butchering has retained a surprising degree of its old-world craftsmanship. Workers armed with knives and hooks anachronistically slice flesh from bone the same way they have for hundreds of years. That’s because cutting meat—be it on an assembly line or in a niche shop in Santa Monica, California, or Brooklyn, New York—is a skill that requires exceptional dexterity, a good eye, and a honed tactile sense for texture and firmness. Industrial robots may be perfectly suited to welding chassis and painting cars, but they don’t have the touch to cut a succulent T-bone steak. That’s likely to change. JBS, one of the country’s largest meat processors, recently acquired a controlling share of Scott Technology, a New Zealand-based robotics firm. Now JBS is looking at ways to automate its facilities. Robots don’t sleep, don’t collect overtime, and don’t suffer the horrific repetitive stress injuries that plague meat workers. Meat is already packed using machines, and if engineers can figure out how to make automated systems that approximate the deft hands of a butcher, there’s little question giants like JBS, Cargill, and Tyson will replace many of their line workers with robots. In the next decade, adroit robots that can see, feel, and move like humans may finally kill off the butcher.   Cont'd...

Mercedes Boots Robots From the Production Line

By Elisabeth Behrmann & Christoph Rauwald for Bloomberg Business:  “Robots can’t deal with the degree of individualization and the many variants that we have today,” Markus Schaefer, the German automaker’s head of production, said at its factory in Sindelfingen, the anchor of the Daimler AG unit’s global manufacturing network. “We’re saving money and safeguarding our future by employing more people.” Mercedes’s Sindelfingen plant, the manufacturer’s biggest, is an unlikely place to question the benefits of automation. While the factory makes elite models such as the GT sports car and the ultra-luxury S-Class Maybach sedan, the 101-year-old site is far from a boutique assembly shop. The complex processes 1,500 tons of steel a day and churns out more than 400,000 vehicles a year. That makes efficient, streamlined production as important at Sindelfingen as at any other automotive plant. But the age of individualization is forcing changes to the manufacturing methods that made cars and other goods accessible to the masses. The impetus for the shift is versatility. While robots are good at reliably and repeatedly performing defined tasks, they’re not good at adapting. That’s increasingly in demand amid a broader offering of models, each with more and more features.   Cont'd...


Featured Product

BitFlow Introduces 6th Generation Camera Link Frame Grabber: The Axion


BitFlow has offered Camera Link frame grabbers for almost 15 years. This latest offering, our 6th generation, combines the power of CoaXPress with the requirements of Camera Link 2.0. Enabling a single- or two-camera system to operate at up to 850 MB/s per camera, the Axion-CL family is the best choice for a CL frame grabber. Like the Cyton-CXP frame grabber, the Axion-CL leverages features such as the new StreamSync system, a highly optimized DMA engine, and expanded I/O capabilities that provide unprecedented flexibility in routing. Two options are available: the Axion 1xE and the Axion 2xE. The Axion 1xE is compatible with one Base, Medium, Full or 80-bit camera, offering PoCL (Power over Camera Link) on both connectors. The Axion 2xE is compatible with two Base, Medium, Full or 80-bit cameras, offering PoCL on both connectors for both cameras. The Axion-CL is the culmination of the continuous improvements and updates BitFlow has made to its Camera Link frame grabbers.