The ABC of RPA: What is robotics and automation in the office?

Robotics and automation in the office are actually much more subtle, typically working behind the scenes.

Safety for Collaborative Robots: New ISO/TS 15066

Collaborative robots are the latest technology, and may provide just the solution you are looking for to improve your production process, but you must still make sure any risks associated with your robotic application are recognized and managed appropriately.

Symmetric Multiprocessing or Virtualization: Maximizing the Value and Power of a Soft-Control Architecture

For a truly simplified and streamlined architecture that is high-performing, scalable, efficient, and built for long-term value, an SMP-enabled Soft-Control Architecture is recommended.

Weiss Enters the Octagon: Unique Shaped Dial Plate Accommodates Robotic Inertia Performance and Ergonomic Efficiencies

By collaborating with WEISS on their intricate medical part subassembly system, Jerit Automation was able to achieve its next-generation octagonal design and performance goals.

Forget self-driving cars: What about self-flying drones?

Tina Amirtha for Benelux: In 2014, three software engineers decided to create a drone company in Wavre, Belgium, just outside Brussels. All were licensed pilots and trained in NATO security techniques. But rather than build drones themselves, they decided they would upgrade existing radio-controlled civilian drones with an ultra-secure software layer to allow the devices to fly autonomously. Their company, EagleEye Systems, would manufacture the onboard computer and design the software, while existing manufacturers would provide the drone body and sensors. Fast-forward to the end of March this year, when the company received a Section 333 exemption from the US Federal Aviation Administration to operate and sell its brand of autonomous drones in the US. The decision came amid expectations that the FAA will loosen its restrictions on legal drone operations and issue new rules to allow drones to fly above crowds. Cont'd...

SUNSPRING by 32 Tesla K80 GPUs

From Ross Goodwin on Medium: To call the film above surreal would be a dramatic understatement. Watching it for the first time, I almost couldn't believe what I was seeing: actors taking something without any objective meaning, and breathing semantic life into it with their emotion, inflection, and movement. After further consideration, I realized that actors do this all the time. Take any obscure line of Shakespearean dialogue and consider that 99.5% of the audience who hears that line in 2016 would not understand its meaning if they read it on paper. However, in a play, they do understand it based on its context and the actor's delivery. As Modern English speakers, when we watch Shakespeare, we rely on actors to imbue the dialogue with meaning. And that's exactly what happened in Sunspring, because the script itself has no objective meaning. On watching the film, many of my friends did not realize that the action descriptions as well as the dialogue were computer generated. After examining the output from the computer, the production team made an effort to choose only action descriptions that realistically could be filmed, although the sequences themselves remained bizarre and surreal... (medium article with technical details)

Here is the stage direction that led to Middleditch's character vomiting an eyeball early in the film:

C (smiles) I don't know anything about any of this.
H (to Hauk, taking his eyes from his mouth) Then what?
H2 There's no answer.

Employing Drones to Solve Business' Most Complex Issues

A true system doesn't only take the technology into account, but also the processes and human aspects.

5D Robotics + Aerial MOB = Autonomy and Reliability

Marrying Aerial MOB's operational experience and IP portfolio with 5D's robust autonomy and behavioral technology bridges many of the gaps in delivering valuable products to industrial clients, such as those in oil and gas, utilities, and construction, among others.

Real-time Behaviour Synthesis for Dynamic Hand-Manipulation

From Vikash Kumar at the University of Washington: Dexterous hand manipulation is one of the most complex types of biological movement, and has proven very difficult to replicate in robots. The usual approaches to robotic control - following pre-defined trajectories or planning online with reduced models - are both inapplicable. Dexterous manipulation is so sensitive to small variations in contact force and object location that it seems to require online planning without any simplifications. Here we demonstrate for the first time online planning (or model-predictive control) with a full physics model of a humanoid hand, with 28 degrees of freedom and 48 pneumatic actuators. We augment the actuation space with motor synergies which speed up optimization without removing flexibility. Most of our results are in simulation, showing nonprehensile object manipulation as well as typing. In both cases the input to the system is a high-level task description, while all details of the hand movement emerge online from fully automated numerical optimization. We also show preliminary results on a hardware platform we have developed, "ADROIT": a ShadowHand skeleton equipped with faster and more compliant actuation... (website)
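The controller described above replans at every timestep against a full physics model. As a rough illustration of the receding-horizon idea only, here is a minimal random-shooting MPC loop in Python; the toy dynamics, cost, and all names are invented for this sketch, and the paper's actual trajectory optimizer is far more sophisticated.

    import numpy as np

    def mpc_step(state, dynamics, cost, horizon=10, n_samples=128, n_act=4):
        # Random-shooting MPC: sample candidate action sequences, roll each
        # through the forward model, and return the first action of the
        # cheapest rollout. The controller then replans from the next state.
        best_cost, best_first_action = np.inf, np.zeros(n_act)
        for _ in range(n_samples):
            actions = np.random.uniform(-1.0, 1.0, size=(horizon, n_act))
            s, total = state.copy(), 0.0
            for a in actions:
                s = dynamics(s, a)   # the paper uses a full physics model here
                total += cost(s, a)
            if total < best_cost:
                best_cost, best_first_action = total, actions[0]
        return best_first_action

    # Toy stand-ins: a linear plant and a quadratic "reach the origin" cost.
    dyn = lambda s, a: 0.9 * s + 0.1 * a.sum()
    cst = lambda s, a: float(np.sum(s ** 2) + 0.01 * np.sum(a ** 2))
    print(mpc_step(np.ones(3), dyn, cst))

Per the abstract, the paper additionally optimizes in a lower-dimensional motor-synergy space, which speeds up planning without giving up the full actuation range.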

Ingestible origami robot

MIT News via Larry Hardesty for RoboHub: In experiments involving a simulation of the human esophagus and stomach, researchers at MIT, the University of Sheffield, and the Tokyo Institute of Technology have demonstrated a tiny origami robot that can unfold itself from a swallowed capsule and, steered by external magnetic fields, crawl across the stomach wall to remove a swallowed button battery or patch a wound. The new work, which the researchers are presenting this week at the International Conference on Robotics and Automation, builds on a long sequence of papers on origami robots from the research group of Daniela Rus, the Andrew and Erna Viterbi Professor in MIT's Department of Electrical Engineering and Computer Science. Cont'd...

Artistic Style Transfer for Videos

From Manuel Ruder, Alexey Dosovitskiy, and Thomas Brox of the University of Freiburg: In the past, manually re-drawing an image in a certain artistic style required a professional artist and a long time. Doing this for a video sequence single-handedly was beyond imagination. Nowadays computers provide new possibilities. We present an approach that transfers the style from one image (for example, a painting) to a whole video sequence. We make use of recent advances in style transfer in still images and propose new initializations and loss functions applicable to videos. This allows us to generate consistent and stable stylized video sequences, even in cases with large motion and strong occlusion. We show that the proposed method clearly outperforms simpler baselines both qualitatively and quantitatively... (pdf paper)
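The "loss functions applicable to videos" center on temporal consistency: each stylized frame is tied to its optical-flow-warped predecessor wherever the flow is reliable. A minimal numpy sketch of such a penalty, assuming the warped previous frame and a per-pixel reliability mask have already been computed (names and shapes are illustrative, not the authors' code):

    import numpy as np

    def temporal_loss(stylized_curr, warped_prev, reliability_mask):
        # Penalize deviation from the flow-warped previous stylized frame,
        # but only where the flow is reliable (mask == 1). Occluded or
        # disoccluded pixels are free to change, which prevents ghosting.
        diff = (stylized_curr - warped_prev) ** 2        # H x W x 3
        masked = reliability_mask[..., None] * diff      # mask is H x W
        return masked.sum() / stylized_curr.size

    # Toy usage with random frames.
    h, w = 64, 64
    curr = np.random.rand(h, w, 3)
    prev_warped = np.random.rand(h, w, 3)
    mask = (np.random.rand(h, w) > 0.1).astype(np.float32)
    print(temporal_loss(curr, prev_warped, mask))

The same warped previous frame also makes a natural starting point for optimizing the next frame, which is in the spirit of the "new initializations" the abstract mentions.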

A 'pick-by-robot' solution using a perception-controlled logistics robot called TORU

Designed to navigate freely and dynamically amongst a human workforce, TORU operates between regular shelves, picking a wide range of objects.

Zero Zero Hover Camera drone uses face tracking tech to follow you

Lee Mathews for Geek: Camera-toting drones that can follow a subject while they're recording aren't a new thing, but a company called Zero Zero is putting a very different spin on them. It's all about how they track what's being filmed. Zero Zero's new Hover Camera doesn't require you to wear a special wristband like AirDog. There's no "pod" to stuff in your pocket like the one that comes with Lily, and it doesn't rely on GPS either. Instead, the Hover Camera uses its "eyes" to follow along. Unlike some drones that use visual sensors to lock on to a moving subject, the Hover Camera uses them in conjunction with face and body recognition algorithms to ensure that it's actually following the person you want it to follow. For now, it can only track the person you initially select. By the time the Hover Camera goes up for sale, however, Zero Zero says it will be able to scan the entire surrounding area for faces. Cont'd...
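Zero Zero's algorithms are proprietary, but the follow-by-vision idea is easy to illustrate with OpenCV's stock face detector. A minimal sketch, with a desktop webcam standing in for the drone's camera and the flight control reduced to a comment:

    import cv2

    # OpenCV's bundled Haar-cascade face detector (the Hover Camera's actual
    # detector is proprietary; this only illustrates the general idea).
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # webcam stand-in for the drone's camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                              minNeighbors=5)
        for (x, y, w, h) in faces:
            # A real drone would turn this box's offset from frame centre
            # into yaw/pitch/throttle commands to keep the subject framed.
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

Adding body recognition on top of detection, as Zero Zero describes, is what keeps the lock on one specific person rather than on whichever face appears.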

Face2Face: Real-time Face Capture and Reenactment of RGB Videos

From Justus Thies, Michael Zollhöfer, Marc Stamminger, Christian Theobalt and Matthias Nießner: We present a novel approach for real-time facial reenactment of a monocular target video sequence (e.g., a YouTube video). The source sequence is also a monocular video stream, captured live with a commodity webcam. Our goal is to animate the facial expressions of the target video by a source actor and re-render the manipulated output video in a photo-realistic fashion. To this end, we first address the under-constrained problem of facial identity recovery from monocular video by non-rigid model-based bundling. At run time, we track facial expressions of both source and target video using a dense photometric consistency measure. Reenactment is then achieved by fast and efficient deformation transfer between source and target. The mouth interior that best matches the re-targeted expression is retrieved from the target sequence and warped to produce an accurate fit. Finally, we convincingly re-render the synthesized target face on top of the corresponding video stream such that it seamlessly blends with the real-world illumination... (full paper)
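At its core, the reenactment step drives the target's identity with the source actor's expression parameters inside a parametric face model. A toy numpy illustration of that transfer using a linear blendshape model (dimensions and data are invented; the paper's non-rigid bundling, photometric tracking, and deformation transfer are far more involved):

    import numpy as np

    def blendshape_face(identity_mesh, expr_basis, expr_coeffs):
        # Linear blendshape model: rest shape plus weighted expression offsets.
        return identity_mesh + expr_basis @ expr_coeffs

    # Toy dimensions: 1000 vertices (flattened xyz), 76 expression blendshapes.
    n_verts, n_expr = 3000, 76
    target_identity = np.random.randn(n_verts)
    target_expr_basis = np.random.randn(n_verts, n_expr)

    # Expression coefficients tracked from the source actor's webcam feed.
    source_expr_coeffs = np.random.rand(n_expr)

    # Transfer: apply the source's expression to the target's identity,
    # yielding the reenacted face geometry to be re-rendered over the video.
    reenacted = blendshape_face(target_identity, target_expr_basis,
                                source_expr_coeffs)

Keeping the identity and expression parameters separate is what lets the method swap expressions between people without altering who the target looks like.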

As Robots Are Fast Becoming Our Co-workers, We Take A Look At When And Where This Alliance Began

While there are an increasing number of 'physical' robots, such as drones and self-driving delivery vehicles, software robots are becoming more and more common in the workplace, automating front- and back-office functions across a variety of industries and sectors.


Featured Product

Denso Robotics - Newest 6-axis VMB Series offers longer arm reach and higher load capacity

The new VMB series represents some of the newest members of our 6-axis family of robots. These high-performance, versatile units offer a longer arm reach and a higher load capacity than traditional models, making the VMB an excellent solution for palletizing, packaging, and material handling. New features include more air-piping, valve, and signal-line options, as well as new programming options with state-of-the-art functions using our new WINCAPS Plus software. The VMB offers an IP67 protection grade and meets ISO Class 5, making it suitable for electric parts, food manufacturing processes, and pharmaceutical and medical devices. With the addition of the new VMB large robots, all manufacturing processes can now be automated by DENSO Robotics.