Sharon Gaudin for Computerworld: In just five years, intelligent systems and robots may have taken up to 6% of U.S. jobs, according to a Forrester Research report released this week.
As artificial intelligence (A.I.) advances to better understand human behavior and make decisions on its own in complicated situations, it will enable smart software and robots to take on increasingly challenging jobs. That means robotics should be able to take over some jobs traditionally held by humans by 2021.
For instance, Forrester predicts that smart systems like autonomous robots, digital assistants, A.I. software and chatbots will take over customer service rep jobs and eventually even serve as truck and taxi drivers. Cont'd...
Joshua Swingle for Android Headlines: LG is certainly no stranger to robotics and smart appliances, but until now such products have had limited use. With the company’s latest announcement, this will change. The electronics giant has confirmed that it is investing heavily in robotics in the hope of capitalizing on advanced AI. The goal is twofold: products that combine hardware with artificial intelligence to work alongside smart home appliances, and machines that can perform everyday tasks. “We will prepare for the future by aggressively investing in smart home, robots and key components and strengthen the home appliances business’s capabilities,” said Jo Seung-jin, head of LG’s appliances business.
As of now, there’s no time frame for when the results of these investments will reach shelves, but LG already has plans for products that will work with air conditioners and washing machines, and the company is also researching how AI could be combined with self-driving cars. Although such plans aren’t exactly detailed, the investment does signal a change in the way LG treats robotics. Until now the company has only experimented with products of limited use, but the shift suggests that robotics is becoming one of LG’s main priorities, meaning that having a robot in the home could become the norm for consumers. LG has not yet confirmed how much money it plans to invest in robotics. Cont'd...
Shivali Best for MailOnline: While Sony is currently one of the leading producers of smartphones, cameras and home entertainment systems, the company may soon be heading into the realm of robotics and AI. On Thursday, Kazuo Hirai, CEO of the Tokyo-based company, took to the stage at the IFA electronics show in Berlin to discuss the firm's newest products. He said that Sony was keen to explore new areas of technology, and that artificial intelligence and robotics were part of that.
The move towards robotics and AI is part of Sony's 'last one inch' mantra, which refers to getting products close to consumers. Mr Hirai said: 'I think the combination of "the last one inch" - things that you hold in your hand to access or upload information, entertainment and so on - combined with AI and robotics is the area that is going to be a future growth area in a big way for Sony.' Cont'd...
Julia Alexander for Polygon: With the HTC Vive and Oculus Rift headsets, the first wave of mainstream consumer VR has officially arrived, and with it comes the question of how to continually improve the experience for those using it.
As it stands right now, those who want to use devices like the Vive or Rift must do so with controllers; the Rift uses an Xbox One controller, while the Vive comes with its own dedicated peripheral. Both serve their purpose, but they come with certain limitations when trying to achieve the ideal VR experience.
Now, Dexmo Robotics has unveiled what it thinks will solve some of those frustrations: a mechanical exoskeleton glove that can be paired with VR headsets. The glove provides 11 degrees of freedom of movement, and the company touts the fact that each finger comes with a pressure sensor. Essentially, if you’re playing a first-person shooter, you’ll be able to feel the in-game gun's trigger being squeezed as well as the recoil. Full Article:
Caitlin Ju for The Stanford Daily: Stanford researchers in the Computational Vision and Geometry Lab have designed an autonomously navigating robot prototype that they say understands implicit social conventions and human behavior. Named “Jackrabbot” after the swift but cautious jackrabbit, the visually intelligent and socially amicable robot is able to maneuver through crowds and pedestrian spaces.
A white ball on wheels, the Jackrabbot is built on a Segway platform and contains a computing unit and multiple sensors that capture the 3-D structure of the surrounding environment. 360-degree cameras and GPS further enhance the robot’s navigation and detection capabilities.
To interact smoothly in public settings, the robot has to know how to avoid someone in a natural way, how to yield the right-of-way and how to leave personal space, among other skills. Cont'd...
Abigail Beall for MailOnline: Many people spend their childhood peering up into the vast expanse of the sky, dreaming of growing up to become an astronaut. But those dreams could be dashed: the idea of people venturing into space may one day become a distant memory, according to a report published today. Robots will eventually be capable enough to replace humans and other animals on space missions, experts have said. Many missions involving humans in space are dangerous and expensive, but for years robots have been sent to places humans could not venture, like the probes travelling to the edges of our solar system. According to European Space Agency (ESA) astronaut Roberto Vittori, who launched a paper on space robotics and autonomous systems, robots can help carry out these dangerous missions. Cont'd...
Yuri Kageyama for News Factor: The U.S. robotics expert tapped to head Toyota's Silicon Valley research company says the $1 billion investment by the giant Japanese automaker will start showing results within five years.
Gill Pratt told reporters that the Toyota Research Institute is also looking ahead to a more distant future of cars that anyone, including children and the elderly, can ride in on their own, as well as robots that help out in homes.
Pratt, a former program manager at the U.S. military's Defense Advanced Research Projects Agency, first joined Toyota Motor Corp. as a technical adviser when the automaker set up its artificial intelligence research effort at Stanford University and MIT.
He said safety features will be the first AI applications to appear in Toyota vehicles. Some are already offered on models now being sold, including sensors that help cars brake or warn drivers before a possible crash, and cars that drive themselves into parking spaces or along certain roads.
"I expect something to come out during those five years," Pratt told reporters recently at Toyota's Tokyo office of the timeframe seen for the investment. Cont'd...
From MIT News: A video-trained system from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) could help robots understand how objects interact with the world. CSAIL researchers have demonstrated an algorithm that has effectively learned how to predict sound: when shown a silent video clip of an object being hit, the algorithm can produce a sound for the impact that is realistic enough to fool human viewers.
This “Turing Test for sound” represents much more than just a clever computer trick: Researchers envision future versions of similar algorithms being used to automatically produce sound effects for movies and TV shows, as well as to help robots better understand objects’ properties... (Full Article) (Full Paper)
Graham Templeton for ExtremeTech: Google’s artificial intelligence researchers are starting to have to code around their own code, writing patches that limit a robot’s abilities so that it continues to develop down the path desired by the researchers — not by the robot itself. It’s the beginning of a long-term trend in robotics and AI in general: once we’ve put in all this work to increase the insight of an artificial intelligence, how can we make sure that insight will only be applied in the ways we would like?
That’s why researchers from Google’s DeepMind and the Future of Humanity Institute have published a paper outlining a software “killswitch” they claim can stop those instances of learning that could make an AI less useful — or, in the future, less safe. It’s really less a killswitch than a blind spot, removing from the AI the ability to learn the wrong lessons. Cont'd...
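The "blind spot" idea can be illustrated with a toy example. The sketch below is not DeepMind's implementation; it is a minimal, hypothetical Q-learning setup in which an overseer sometimes halts the agent mid-episode. A naive agent treats the shutdown as a bad outcome and learns to avoid the overseer, while a "safely interruptible" agent simply never updates on interrupted steps, so the interruptions leave its learned values unbiased.

```python
import random

# Toy illustration of safe interruptibility (NOT DeepMind's implementation):
# a Q-learning agent walks a 5-cell corridor toward a goal at cell 4, and an
# overseer sometimes halts it on entering cell 2. A naive agent folds the
# shutdown into its value estimates as a zero-value ending; a "safely
# interruptible" agent skips the update entirely, the blind spot described above.

N_STATES, GOAL, ACTIONS = 5, 4, (-1, +1)

def train(respect_interrupts, episodes=2000, alpha=0.1, gamma=0.9, eps=0.1):
    Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    rng = random.Random(0)
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda act: Q[(s, act)])
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == GOAL else 0.0
            interrupted = (s2 == 2 and rng.random() < 0.5)  # overseer halts here
            if interrupted and respect_interrupts:
                break  # blind spot: the interrupted step is never learned from
            if interrupted:
                target = r  # naive: the shutdown looks like a zero-value ending
            else:
                target = r + gamma * max(Q[(s2, b)] for b in ACTIONS)
            Q[(s, a)] += alpha * (target - Q[(s, a)])
            if interrupted:
                break
            s = s2
    return Q

safe, naive = train(True), train(False)
# The naive agent's value for stepping toward the overseer's cell is dragged
# down by the shutdowns; the safe agent's value stays near the true return.
print(safe[(1, 1)], naive[(1, 1)])
```

Under these assumptions, the naive agent's estimate for moving toward the interruption cell sinks well below the safe agent's, which is exactly the kind of "wrong lesson" the paper's mechanism is meant to prevent.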
Mary-Ann Russon for International Business Times: When former Boston Dynamics employees released video of the humanoid robot Atlas walking unassisted over difficult terrain such as rocks and snow, Google was reportedly displeased, despite the research receiving high praise from roboticists and wowing the public.
And according to insiders speaking to Tech Insider, the real reason Google is selling off Boston Dynamics is, by and large, that the robotics firm was unwilling to fall in line with the internet giant's vision of a consumer robot for the home.
Google reportedly envisioned the firm as one of nine in a division called Replicant. Initially, under the guidance of Android co-founder Andy Rubin, the firms would continue with existing research and Google would see what ideas and innovations they came up with. Cont'd...