Mary Jo Foley for All About Microsoft: In the early 2000s, Microsoft was all-in on robotics. By the middle of that decade, the company seemingly had all but abandoned the robotics space.
But this may be the year that Microsoft is ready to get back into robotics, on multiple fronts.
When Microsoft founder Bill Gates was still involved in the day-to-day operations of the company, robotics was slated to be one of Microsoft's next big things. Microsoft built a programming model and framework for developers working on anything from Lego robots to industrial-scale robots. However, that product, "Microsoft Robotics Studio," never really went beyond the academic and hobbyist communities, and the company's ambitions in this space withered.
Cut to 2017. These days, the home for a good chunk of Microsoft's current robotics work is apparently Microsoft Research (MSR) -- specifically the AI + Research (AI+R) Group under executive vice president Harry Shum. (I say "apparently" here because Microsoft officials declined to answer any of my questions about the company's robotics initiatives.) Shum is known for his work in computer vision and graphics and has a Ph.D. in robotics from Carnegie Mellon. Cont'd...
Jared Newman for PCWorld: At the 2015 Build conference, Microsoft tried to prove that HoloLens is more than just a neat gimmick.
The company showed off several new demos for its “mixed reality” headset, which can map digital imagery onto the user’s physical surroundings. While previous demos had focused on fun ideas like a virtual Mars walk and a living room-sized version of Minecraft, the Build presentation emphasized real-world applications for businesses and education.
For instance, Microsoft showed how architects could use HoloLens to interact with 3D models, laid out virtually in front of them on a table. They might also be able to examine aspects of a building site at full scale, with virtual beams and walls rendered before their eyes.
Not all the presentations were so serious. Microsoft also showed off an actual robot whose controls appeared in the virtual space above the robot’s head. Users could then create a movement pattern for the robot by tapping on the ground. Another demo showed how users could create their own personal screens that followed them around in real space.
The ST Robotics Workspace Sentry robot and area safety system is based on a small module that sends an infrared beam across the workspace. If the user puts a hand (or any other object) into the workspace, the robot stops, using programmable emergency deceleration. Each module has three beams at different angles, and the distance each beam reaches is adjustable. Two or more modules can be daisy-chained to watch a wider area.

"A robot that is tuned to stop on impact may not be safe. Robots where the trip torque can be set at low thresholds are too slow for any practical industrial application. The best system is where the work area has proximity detectors so the robot stops before impact and that is the approach ST Robotics has taken," states ST Robotics president and CEO David Sands.
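The behavior described above -- daisy-chained beam modules that trip a programmable deceleration ramp before any impact -- can be sketched in a few lines. This is a purely illustrative simulation; the class and function names, the three-beams-per-module layout, and the numeric deceleration values are assumptions for the sketch, not ST Robotics' actual API or firmware.

```python
# Illustrative sketch of a proximity-based safety stop, as described above.
# Each hypothetical module watches three IR beams; any broken beam anywhere
# in the daisy chain halts the robot via a gradual deceleration ramp
# rather than a hard stop-on-impact.

class SentryModule:
    """One beam module: three beams at different angles, adjustable reach."""
    def __init__(self, beam_range_m=1.0):
        self.beam_range_m = beam_range_m        # adjustable beam reach
        self.beams_clear = [True, True, True]   # three beams per module

    def tripped(self):
        # Module trips if any of its three beams is interrupted.
        return not all(self.beams_clear)

def chain_tripped(modules):
    """Daisy-chained modules: a broken beam on any module trips the chain."""
    return any(m.tripped() for m in modules)

def emergency_decelerate(speed, decel_per_step=0.5):
    """Programmable emergency deceleration: ramp speed down to zero."""
    profile = []
    while speed > 0:
        speed = max(0.0, speed - decel_per_step)
        profile.append(speed)
    return profile

# Example: two chained modules covering a wider area; an object
# breaks one beam, and the robot decelerates before any contact.
chain = [SentryModule(), SentryModule(beam_range_m=1.5)]
chain[1].beams_clear[0] = False      # hand enters the workspace
if chain_tripped(chain):
    ramp = emergency_decelerate(2.0)  # robot moving at 2.0 units/s
    # ramp == [1.5, 1.0, 0.5, 0.0]
```

The key design point mirrors Sands' quote: detection happens at the workspace boundary, so the stop begins before impact instead of relying on trip torque sensed at contact.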