Two-dimensional, vision-based μN force sensor for microrobotics
Related papers
Volume 6: ASME Power Transmission and Gearing Conference; 3rd International Conference on Micro- and Nanosystems; 11th International Conference on Advanced Vehicle and Tire Technologies, 2009
In this paper, we present three designs for a decoupled, two-dimensional, vision-based µN force sensor for microrobotic applications. There are currently no reliable, off-the-shelf, commercially available force sensors that can measure forces at this scale and be easily integrated into standard microrobotic test-beds. In our previous work, we presented a design consisting of a planar, elastic mechanism with known force-deflection characteristics, inspired by the designs of pre-existing MEMS suspension mechanisms. A CCD camera is used to track the deformation of the mechanism as it manipulates objects in a micro/meso-scale robotic manipulation test-bed. By observing the displacements of select points in the mechanism, the manipulation forces can be estimated. Here, a building-block approach to the conceptual synthesis of compliant mechanisms is used to design for decoupled displacements of the tracking points when the tip is subjected to forces in the XY-plane. By designing mechanisms with circular compliance and stiffness ellipses, along with zero-magnitude compliance and stiffness vectors, we are able to achieve our design requirements. Validation of this approach with macro-scale prototypes is presented, along with recommendations for scaling the designs down for microrobotic applications and a sensitivity analysis of the final designs that yields insights for microfabricating them.
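The force estimate described above rests on a linear force-deflection relation, F = K u, where K is the mechanism's planar stiffness matrix and u is the vision-measured displacement of a tracking point; the decoupled design makes K diagonal, so each force component depends on only one displacement component. A minimal sketch of that mapping follows; the stiffness and displacement values are illustrative placeholders, not figures from the paper.

```python
# Minimal sketch (not the authors' code): planar force from a tracked
# displacement via a known 2x2 stiffness matrix, F = K u.
import numpy as np

# Placeholder stiffness matrix in N/m; the zero off-diagonal terms reflect
# the decoupled design (circular compliance/stiffness ellipses).
K = np.array([[1.0, 0.0],
              [0.0, 1.0]])

# Displacement of the tracking point measured by the camera, in metres.
u = np.array([3.0e-6, -1.5e-6])

F = K @ u                      # force estimate in newtons
print(F * 1e6)                 # ~[ 3.0 -1.5 ] micro-newtons
```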
A two dimensional vision-based force sensor for microrobotic applications
Sensors and Actuators A: Physical, 2011
We present a two-dimensional, vision-based force sensor capable of sensing micro-Newton-level forces for use in microrobotic applications. Our design consists of a planar, elastic mechanism with known force-deflection characteristics. A CCD camera attached to an optical microscope is used to track the deformation of the mechanism as it is used to manipulate objects in a microrobotic test-bed. By observing the displacements of select points in the mechanism, the manipulation forces can be extracted in real time to achieve force-guided manipulation of micro-scale objects. The modeling, design, microfabrication, calibration, and experimental validation of the force sensor for use in a microassembly application are presented here, along with preliminary results for next-generation designs.
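One of the steps listed above, calibration, amounts to fitting the sensor stiffness from pairs of known applied forces and vision-measured deflections. The sketch below shows a least-squares fit along one axis; the data points are invented placeholders, not measurements from the paper.

```python
# Minimal sketch (illustrative only): fitting a single-axis stiffness k from
# calibration data, assuming the linear model F = k * delta.
import numpy as np

delta = np.array([0.0, 2.0e-6, 4.0e-6, 6.0e-6, 8.0e-6])   # deflection, m
F     = np.array([0.0, 2.1e-6, 3.9e-6, 6.2e-6, 7.8e-6])   # applied force, N

# Least-squares slope through the origin: k = (delta . F) / (delta . delta)
k = np.dot(delta, F) / np.dot(delta, delta)
print(f"calibrated stiffness: {k:.3f} N/m")

# Once calibrated, a new vision-measured deflection maps directly to force.
print(f"force at 5 um deflection: {k * 5.0e-6 * 1e6:.2f} uN")
```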
A Microforce-Sensing Mobile Microrobot for Automated Micromanipulation Tasks
IEEE Transactions on Automation Science and Engineering, 2019
This paper presents a microforce-sensing mobile microrobot (µFSMM) for use in automated micromanipulation tasks. The design consists of a planar, vision-based microforce-sensing end-effector, while the microrobot body is made of chemically etched nickel driven by an external magnetic field. With a known stiffness, the manipulation forces can be determined by observing the deformation of the end-effector through a camera attached to an optical microscope. After analyzing and calibrating the stiffness of a micromachined prototype, the mobility and in situ force-sensing capabilities are verified through real-time, closed-loop, force-controlled manipulation tests with automated path planning and navigation. The calibrated stiffness of the fabricated microforce-sensor end-effectors is on the order of 10⁻³ N/m. The online (real-time) force-sensing resolution is approximately 1.5 µN, and the sensing range is 0-20 µN along the two planar directions. In automated micromanipulation experiments with a microcomponent, the µFSMM uses real-time force control to apply a prescribed force of 6 µN to a desired location on a fixed microobject. Similarly, in another automated micromanipulation experiment, the µFSMM demonstrates the use of real-time force control to keep the manipulation forces experienced by the microobject below a threshold of 12 µN. Note to Practitioners: This paper was motivated by recent interest in single-cell biological micromanipulation tasks that seek to understand the role of environmental forces on the mechanics of cell development (mechanobiology) and the biological mechanisms that control such behavior (mechanotransduction). In addition, tissue engineering applications require the safe micromanipulation of single cells to desired locations in the workspace for growing tissue scaffolds. The µFSMM presented here can be easily inserted into existing biological testbeds for these applications. The designed magnetic coil system is compatible with standard inverted optical microscopes, and a digital camera for real-time image processing is already standard in such setups.
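The two control behaviors described above (regulating a prescribed contact force and keeping the force below a threshold) can both be framed as simple feedback on the vision-derived force estimate. Below is a very reduced sketch of such a loop against a simulated linear contact; the stiffness, gain, and set-point values are placeholders, and the real system closes the loop through the magnetic coil commands rather than a direct position update.

```python
# Minimal sketch (illustrative only): proportional force regulation against a
# simulated linear contact, mimicking the "apply a prescribed force" behavior.
k_contact = 1.0          # N/m, placeholder contact stiffness
f_target  = 6.0e-6       # N, prescribed contact force (6 uN)
gain      = 0.2          # proportional gain on the position command, placeholder

x = 0.0                  # commanded penetration of the end-effector tip, m
for step in range(50):
    f_meas = k_contact * x                 # stand-in for the vision-based force estimate
    error = f_target - f_meas
    if abs(error) < 0.05e-6:               # within 0.05 uN of the set point
        break
    x += gain * error / k_contact          # move to reduce the force error

print(f"converged in {step} steps, force = {f_meas * 1e6:.2f} uN")
```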
Integrating Optical Force Sensing with Visual Servoing for Microassembly
Journal of Intelligent and Robotic Systems, 2000
For microassembly tasks, uncertainty exists at many levels. Single static sensing configurations are therefore unable to provide feedback with the necessary range and resolution for accomplishing many desired tasks. In this paper we present experimental results that investigate the integration of two disparate sensing modalities, force and vision, for sensor-based microassembly. By integrating these sensing modes, we are able to provide feedback in a task-oriented frame of reference over a broad range of motion with extremely high precision. An optical microscope is used to provide visual feedback down to micron resolutions, while an optical beam-deflection technique (based on a modified atomic force microscope) is used to provide nanonewton-level force feedback or nanometer-level position feedback. Visually servoed motion at speeds of up to 2 mm/s with a repeatability of 0.17 μm is achieved with vision alone. The optical beam-deflection sensor complements the visual feedback by providing positional feedback with a repeatability of a few nanometers, which, based on the principles of optical beam deflection, is equivalent to force measurements on the order of a nanonewton. The value of integrating these two disparate sensing modalities is demonstrated during controlled micropart impact experiments. These experiments demonstrate micropart approach velocities of 80 μm/s with impact forces of 9 nN and final contact forces of 2 nN. Within our microassembly system this level of performance cannot be achieved using either sensing modality alone. This research will aid the development of complex hybrid MEMS devices in two ways: by enabling the microassembly of more complex MEMS prototypes, and by supporting the development of automatic assembly machines for assembling and packaging future MEMS devices that require increasingly complex assembly strategies.
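The coarse-to-fine hand-off implied above (vision for broad, fast motion; the beam-deflection sensor for nanonewton-level contact) is essentially a guarded-move strategy: approach under visual servoing until the force channel registers contact, then regulate on force. The sketch below illustrates that switching logic against a simulated contact; the velocities, thresholds, and stiffness are placeholders rather than values taken from the experiments.

```python
# Minimal sketch (illustrative only): guarded approach that switches from
# vision-guided motion to force-based stopping once contact is detected.
def beam_deflection_force(position_m, contact_at_m=50e-6, k_contact=1.0e-3):
    """Stand-in for the optical beam-deflection force signal (N)."""
    penetration = position_m - contact_at_m
    return k_contact * penetration if penetration > 0.0 else 0.0

dt = 1e-3                 # control period, s (placeholder)
v_approach = 80e-6        # approach velocity, m/s
f_stop = 2e-9             # desired final contact force, N (2 nN)

pos = 0.0
while True:
    f = beam_deflection_force(pos)
    if f >= f_stop:       # contact detected by the force channel: stop
        break
    pos += v_approach * dt   # otherwise keep approaching under visual servoing

print(f"stopped at {pos * 1e6:.2f} um with contact force {f * 1e9:.2f} nN")
```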
Vision Based Force Sensing for Nanorobotic Manipulation
Dynamic Systems and Control, Parts A and B, 2006
Over the last decade, considerable interest has been generated in building and manipulating nanoscale structures. Applications of nanomanipulation include the study of nanoparticles, molecules, DNA, and viruses, and bottom-up nanoassembly. We propose a nanomanipulation system using the Zyvex S100 nanomanipulator, which operates within a scanning electron microscope (SEM), as its primary component. The primary advantage of the S100 setup over standard scanning-probe-microscopy-based nanomanipulators is the ability to see the object during manipulation. Relying on visual feedback alone to control the nanomanipulator is undesirable, however, due to perceptual limitations of depth and contact within the SEM. To improve operator performance over visual feedback alone, an impedance-controlled bilateral teleoperation setup is envisioned. The lack of on-board force sensors on the S100 system is the primary hindrance to realizing the proposed architecture. In this paper, we present a computer-vision-based force-sensing scheme. The advantages of this sensing strategy include its low cost and the fact that it requires no hardware modification. Force sensing is implemented using an atomic force microscopy (AFM) probe attached to the S100 end-effector. Deformation of the cantilever probe is monitored using a Hough-transform-based algorithm, and the observed deformations are mapped to corresponding end-effector forces using the Euler-Bernoulli beam model. The forces thus sensed can be used to provide force feedback to the operator through a master manipulator.
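For a tip-loaded rectangular cantilever, the Euler-Bernoulli model gives F = 3 E I δ / L³ with I = w t³ / 12, so the vision-measured tip deflection δ maps directly to a force. The sketch below applies that relation with generic probe dimensions chosen for illustration; they are not the probe parameters used in the paper.

```python
# Minimal sketch (illustrative only): tip force from the vision-measured
# deflection of a rectangular AFM cantilever via Euler-Bernoulli beam theory.
def cantilever_tip_force(deflection_m, E=169e9, L=450e-6, w=50e-6, t=2e-6):
    """F = 3 E I d / L^3, with I = w t^3 / 12 (placeholder silicon probe)."""
    I = w * t ** 3 / 12.0
    return 3.0 * E * I * deflection_m / L ** 3

d = 2.0e-6                          # deflection extracted from the SEM image, m
F = cantilever_tip_force(d)
print(f"estimated tip force: {F * 1e9:.1f} nN")
```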
Miniature Compliant Grippers With Vision-Based Force Sensing
IEEE Transactions on Robotics, 2010
This paper is concerned with grasping biological cells in an aqueous medium with miniature grippers that can also help estimate forces using vision-based displacement measurement and computation. We present the design, fabrication, and testing of three single-piece, compliant miniature grippers with parallel and angular jaw motions. Two grippers were designed using experience and intuition, while the third was designed using topology optimization with implicit manufacturing constraints. The grippers were fabricated with different manufacturing techniques from spring steel and polydimethylsiloxane (PDMS). They also serve as force sensors: toward this end, we present a vision-based force-sensing technique that solves Cauchy's problem in elasticity using an improved algorithm. We validated this technique at the macroscale, where an independent method was available to estimate the force. In this study, the gripper was used to hold a yeast ball and a zebrafish egg cell, each less than 1 mm in diameter. The forces involved were estimated to be about 30 mN for the yeast ball and 10 mN for the zebrafish egg cell.
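Solving Cauchy's problem here means inferring the forces on the gripper jaws from displacements observed on part of the boundary. In a linear-elastic setting that inverse step reduces to a least-squares problem: with a compliance model u = C f relating candidate contact forces to displacements at the tracked points, the observed displacements determine f. The sketch below illustrates that reduction with a small made-up compliance matrix; it is a stand-in for, not a reproduction of, the full elasticity formulation used in the paper.

```python
# Minimal sketch (illustrative only): recovering contact forces from tracked
# displacements by least squares, u_obs ~= C @ f, for a linear-elastic model.
import numpy as np

# Placeholder compliance matrix (m/N): 4 tracked points x 2 candidate contact
# forces on the jaws. In practice C would come from a calibrated elastic model.
C = np.array([[2.0e-3, 0.5e-3],
              [1.5e-3, 0.8e-3],
              [0.6e-3, 1.7e-3],
              [0.4e-3, 2.1e-3]])

f_true = np.array([30e-3, 10e-3])                 # N, used to generate synthetic data
u_obs = C @ f_true + 1e-7 * np.random.randn(4)    # add a little "vision noise"

f_est, *_ = np.linalg.lstsq(C, u_obs, rcond=None)
print(f_est * 1e3)                                # recovered forces in mN, ~[30, 10]
```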
Real-Time Force-Feedback Micromanipulation Using Mobile Microrobots With Colored Fiducials
IEEE Robotics and Automation Letters, 2018
This article reports the design and fabrication of wirelessly controlled mobile microrobots with an on-board, vision-based 2D micro-force sensor for use in force-guided micromanipulation tasks. The micro-force-sensing mobile microrobots (µFSMM) presented here feature compliant end-effectors with different stiffnesses and colored tracking fiducials, which are used in conjunction with computer vision algorithms to provide real-time micro-force feedback to the user during teleoperated micromanipulation tasks. The stiffness of the compliant springs has been tailored to give mobile microrobots, for the first time, an in situ micro-Newton-level force detection capability with sub-µN resolution. The µFSMMs can be used in mechanobiology, theranostics, and force-guided micromanipulation of delicate structures such as biological cells.
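The fiducial-based sensing described above boils down to locating two colored markers in each camera frame and converting the change in their separation into a force through the calibrated spring stiffness. The sketch below shows one common way to do this with OpenCV (HSV thresholding plus contour centroids); the color ranges, pixel scale, stiffness, and rest length are placeholders, not the values or the exact pipeline from the article.

```python
# Minimal sketch (illustrative only): locate two colored fiducials per frame
# and convert their separation change into a force via a known stiffness.
import cv2
import numpy as np

UM_PER_PX = 0.5          # microscope calibration, placeholder
K_SPRING  = 2.0e-3       # N/m, placeholder end-effector stiffness
REST_SEP  = 200.0        # fiducial separation at zero force, um (placeholder)

def centroid(mask):
    """Centroid (x, y) in pixels of the largest blob in a binary mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def force_from_frame(frame_bgr):
    """Return the sensed force in newtons, or None if a fiducial is not found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    red  = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))      # placeholder color ranges
    blue = cv2.inRange(hsv, (100, 120, 70), (130, 255, 255))
    p_red, p_blue = centroid(red), centroid(blue)
    if p_red is None or p_blue is None:
        return None
    sep_um = np.linalg.norm(p_red - p_blue) * UM_PER_PX
    deflection_m = (sep_um - REST_SEP) * 1e-6
    return K_SPRING * deflection_m
```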
Biologically-Inspired Microfabricated Force and Position Mechano-Sensors
Sensors and Sensing in Biology and Engineering, 2003
The aim of this paper is to discuss an ideal design procedure for biologically-inspired mechanosensors. The main steps of this procedure are: (1) analysis of force and position sensors in animals and humans; (2) analysis of technologies available for MEMS (Micro-Electro-Mechanical Systems); and (3) design and implementation of biologically-inspired sensors in innovative mechatronic and biomechatronic systems (e.g., anthropomorphic robots, prostheses, and neuroprostheses).