Optical Tweezers Control Micro-Robots

The development of light-driven 'micro-robots' that can autonomously investigate and manipulate the nano-scale environment in a microscope comes a step closer, thanks to new research from the University of Bristol.

Such devices could be used for high-resolution imaging, allowing the investigation of delicate biological samples such as cells in new ways.

Dr David Phillips, Professor Mervyn Miles and Dr Stephen Simpson of Bristol's School of Physics, and colleagues, aim to develop such micro-robots and control them using a technology known as 'optical tweezers'. In a paper published today in Nature Photonics, they investigate how optical tweezers can be used to manipulate nanofabricated structures to generate high-resolution images.

Optical tweezers use light to move microscopic objects such as individual cells or particles 1,000 times smaller than the width of a human hair. When light reflects from a surface, or bends as it travels into a transparent material, it exerts a force on the object with which it has interacted. This force is very small, so we do not normally notice it in everyday life. But when the objects themselves are small, such as microscopic particles, the forces that light exerts on them can be enough to push and pull them around.
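A rough back-of-the-envelope calculation shows why light's push matters only at the micro-scale. The sketch below (not from the paper; the laser power and bead size are illustrative assumptions) compares the radiation-pressure force of a modest laser beam reflecting off an object, F = 2P/c, with the weight of a one-micrometre silica bead:

```python
# Why light's push only matters for microscopic objects:
# compare radiation pressure from a small laser with a bead's weight.

C = 3.0e8  # speed of light, m/s
PI = 3.141592653589793

def reflection_force(power_w):
    """Force (N) from a beam of power P reflecting straight back: F = 2P/c."""
    return 2.0 * power_w / C

def bead_weight(radius_m, density_kg_m3=2200.0):
    """Weight (N) of a spherical silica bead (density ~2200 kg/m^3)."""
    volume = (4.0 / 3.0) * PI * radius_m ** 3
    return volume * density_kg_m3 * 9.81

f_light = reflection_force(10e-3)  # a 10 mW beam -> tens of piconewtons
f_gravity = bead_weight(0.5e-6)    # 1 um diameter bead -> ~0.01 pN

print(f"optical force: {f_light * 1e12:.1f} pN")
print(f"bead weight:   {f_gravity * 1e12:.4f} pN")
```

For a human-scale object the same force is utterly negligible, but here it exceeds the bead's weight by several thousand times, which is why a focused laser can hold and steer such particles.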

This is the basis of optical tweezing technology: by focusing a laser towards particles in a microscope, scientists can use the light to pick them up, hold them still in one place, or move them around.
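Near the laser focus, a trapped particle behaves much like a bead on a spring: push it off-centre and a restoring force pulls it back, proportional to the displacement. The stiffness value below is an assumed, typical order of magnitude (real traps are calibrated per setup), but it illustrates how measuring a particle's position doubles as measuring the force on it:

```python
# A trapped particle near the focus obeys an effective Hooke's law, F = -k*x.
# Stiffness here is illustrative; real optical traps are calibrated per setup.

TRAP_STIFFNESS = 50.0  # pN per micrometre (typical order of magnitude)

def restoring_force_pn(displacement_um):
    """Force (pN) pulling the particle back towards the laser focus."""
    return -TRAP_STIFFNESS * displacement_um

# A 20 nm (0.02 um) displacement reads out as a ~1 pN force:
print(restoring_force_pn(0.02))  # prints -1.0
```

This spring-like behaviour is what lets the trap both hold a probe in place and report, through the probe's displacement, how hard it is being pushed.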

The Bristol team fabricated a set of needle-like microscopic particle 'probes', and used optical tweezers to pick them up and scan them along the side of a surface inside a microscope sample.

By monitoring changes in the position of the needle as it glided over nano-scale bumps in the surface with a high-speed video camera, they were able to build up an image of the surface that was impossible to see from the microscope image alone.
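The imaging step can be sketched in a few lines. This is an illustrative reconstruction, not the Bristol team's actual pipeline: each video frame yields a tracked tip position, and subtracting the tip's height over a flat reference region turns those positions into a surface profile:

```python
# Illustrative sketch (not the published pipeline): convert tracked probe-tip
# positions from a high-speed camera into a surface height profile.

def surface_profile(tracked_positions, baseline_um):
    """
    tracked_positions: list of (x_um, z_um) tip positions, one per video frame.
    baseline_um: tip height over a known flat reference region.
    Returns (x, height) pairs; the tip's deflection maps surface bumps.
    """
    return [(x, z - baseline_um) for x, z in tracked_positions]

# Five frames of a scan; the tip rises over a bump around x = 0.2-0.3 um:
frames = [(0.0, 5.00), (0.1, 5.00), (0.2, 5.03), (0.3, 5.05), (0.4, 5.00)]
profile = surface_profile(frames, baseline_um=5.00)
for x, h in profile:
    print(f"x = {x:.1f} um, height = {h * 1000:.0f} nm")
```

The resolution of such a map is set by how precisely the camera can localise the tip, which is why features invisible in the raw microscope image can still be recovered.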

They also showed how the precise shape of the needle structure plays a fundamental role in how it behaves in the optical tweezers. They tailored the shape of the needle so that the force it exerted on the surface remained constant throughout the scan, without the need for any computer-controlled feedback. In essence, this stopped the probe from 'pressing too hard', making the device promising for investigating delicate biological samples.

Dr Phillips said: "This work paves the way towards the development of light-driven micro-robotics by providing a set of design rules for how complicated micro-structures will behave in light fields, and using them to design a new scanning probe imaging system that can operate inside an enclosed microfluidic chamber."
