Force-based touch approach for volume estimation (original)
Related papers
Grasping Force Estimation Recognizing Object Slippage by Tactile Data Using Neural Network
2008 IEEE Conference on Robotics, Automation and Mechatronics, 2008
Wider industrial deployment of robots, manipulators, and pick-and-place machines is hampered by their limited intelligence for recognizing objects for grasping and handling. Since robots lack self-awareness, estimating an adequate grasping force for individual objects is a further challenge. This article presents a recently developed mathematical model for computing the scattered energy of vibrations sensed by the stylus during object slippage in robot grippers. The model incorporates dynamic parameters such as trial grasping force, object falling velocity, and the geometry of surface irregularities of the object. The idea is that, combined with the robust decision-making capabilities of an artificial neural network (NN), this model allows a robot to estimate the appropriate/optimal grasping force for an object from its physio-mechanical properties. Building on the model, the article demonstrates an experimental methodology for estimating adequate grasping forces with Backpropagation (BP) neural networks, exploring four different training algorithms for optimal grasping force estimation.
Object Shape and Force Estimation using Deep Learning and Optical Tactile Sensor
Proceedings of The 6th IIAE International Conference on Intelligent Systems and Image Processing 2018
Touch sensing plays an essential role in robot perception: it helps a robot understand its surrounding environment and, in particular, the object it interacts with. For this reason, robots are equipped with tactile sensors. Tactile object recognition, however, remains a challenge in practical scenarios. In this paper, we propose object shape classification and force estimation from a single touch, based on deep learning and an optical tactile sensor. The study comprises three steps. First, the tactile image is selected using an optical flow technique, and image augmentation is applied to increase the number of images. Second, features of each image are extracted by a modified VGG-16 network. Last, the object shape and force estimation classifiers are multi-layer perceptrons (MLPs), a supervised learning technique. The experimental results show an accuracy of 98.9% for classifying six object shapes and 98.68% for estimating eleven force levels, outperforming previous methods that use a tactile image and a single touch.
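The two-stage pipeline described (CNN features feeding an MLP classifier) can be sketched as a plain numpy forward pass. The layer sizes and random weights below are hypothetical stand-ins for illustration, not the paper's modified VGG-16 or its trained MLP:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mlp_classify(features, w1, b1, w2, b2):
    # One hidden layer with ReLU, softmax over the class scores.
    h = np.maximum(0.0, features @ w1 + b1)
    return softmax(h @ w2 + b2)

# Hypothetical sizes: 4096-d CNN feature, 64 hidden units, 6 shape classes.
w1 = rng.normal(scale=0.01, size=(4096, 64)); b1 = np.zeros(64)
w2 = rng.normal(scale=0.01, size=(64, 6));    b2 = np.zeros(6)

# One dummy feature vector standing in for a VGG-style descriptor.
probs = mlp_classify(rng.normal(size=(1, 4096)), w1, b1, w2, b2)
```

In the actual system the feature extractor would be trained end to end; this only shows the shape of the inference path.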
Gaining a Sense of Touch: Physical Parameters Estimation using a Soft Gripper and Neural Networks
ArXiv, 2020
Soft grippers are gaining significant attention for the manipulation of elastic objects, where soft and unstructured objects vulnerable to deformation must be handled. A crucial problem is estimating the physical parameters of a squeezed object in order to adjust the manipulation procedure, which remains a significant challenge. To the best of the authors' knowledge, there is little research on physical parameter estimation that applies deep learning to measurements from direct interaction with objects using robotic grippers. In this work, we propose a trainable system for the regression of a stiffness coefficient and provide extensive experiments in a physics simulator environment. Moreover, we prepared an application that works in a real-world scenario. Our system can reliably estimate the stiffness of an object using the Yale OpenHand soft gripper, based on readings from Inertial Measurement Units (IMUs) attached to its fingers. Additional...
Artificial Intelligence-Based Optimal Grasping Control
Sensors, 2020
A new tactile sensing module, attached to a robot finger, is proposed to sense the contact force and location of an object on a robot hand. Three air pressure sensors are installed at the fingertip to detect the contact force at those points. To obtain a nominal contact force at the finger from the three air pressure sensors, a force estimator was developed based on a trained deep neural network, with the three sensor readings as inputs. In the tactile module, the arrival time of the air pressure sensor data is used to recognize the contact point of the robot finger against an object. Using the three air pressure sensors and the arrival times, the finger surface can be divided into 3 × 3 block locations, and the resolution of contact point recognition was improved to 6 × 4 block locations using an artificial neural network. The accuracy and effectiven...
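A toy version of the arrival-time block scheme might look like the following. The mapping from arrival order and normalised peak pressure to a block index is an assumption made here for illustration only, not the paper's trained network:

```python
import numpy as np

def contact_block(arrival_times_ms, norm_pressure_peak, n_rows=3):
    """Map three air-pressure readings to a coarse 3 x n_rows contact block.

    Hypothetical scheme inspired by the module described above: the sensor
    whose pressure wave arrives first gives the column (the contact is
    closest to that sensor), and the normalised peak pressure, binned into
    n_rows levels, gives the row.
    """
    col = int(np.argmin(arrival_times_ms))               # earliest arrival
    row = min(int(norm_pressure_peak * n_rows), n_rows - 1)  # pressure bin
    return row, col

# Example: wave reaches sensor 0 first, medium peak pressure.
block = contact_block([2.0, 3.5, 5.1], 0.5)
```

The real module refines this coarse grid to 6 × 4 blocks with a neural network; this sketch only conveys the arrival-time idea.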
Featureless classification of tactile contacts in a gripper using neural networks
Sensors and Actuators A: Physical, 1997
A direct, featureless process to classify contact impressions of objects gripped by a robot hand is presented. The information about the type of contact allows selection of the most appropriate manipulation strategy for handling the grasped object. A learning vector quantization (LVQ) network is applied, introducing a contact-pattern preprocessing technique to improve the robustness of the classification with respect to pattern variations in position, orientation and size. © 1997 Elsevier Science S.A.
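The LVQ1 update rule behind such a classifier is compact enough to sketch directly. This is generic textbook LVQ1 with none of the paper's contact-pattern preprocessing:

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """LVQ1: pull the nearest prototype toward same-class samples,
    push it away from different-class samples."""
    W = prototypes.copy()
    for _ in range(epochs):
        for x, label in zip(X, y):
            i = int(np.argmin(((W - x) ** 2).sum(axis=1)))  # best match
            step = lr * (x - W[i])
            W[i] += step if proto_labels[i] == label else -step
    return W

def lvq_predict(X, W, proto_labels):
    # Label of the nearest prototype for each sample.
    return [proto_labels[int(np.argmin(((W - x) ** 2).sum(axis=1)))] for x in X]

# Two synthetic contact-pattern clusters and one prototype per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
W = lvq1_train(X, y, np.array([[1.0, 1.0], [4.0, 4.0]]), [0, 1])
```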
Tactile image based contact shape recognition using neural network
This paper proposes a novel algorithm for recognizing the shape of an object in contact with a robotic finger through tactile pressure sensing. The developed algorithm can distinguish contact shapes from a set of low-resolution pressure maps. Within this algorithm, a novel feature extraction technique transforms a pressure map into a 512-feature vector. The extracted feature of the pressure map is invariant to scale, position, and partial occlusion, and is independent of the sensor's resolution or image size. To recognize different contact shapes from a pressure map, a neural network classifier is developed that uses the feature vector as input. Tests with four different contact shapes show that the trained neural network achieves a success rate of over 90%. Contact sensory information plays a crucial role in robotic hand grasping; the algorithm introduced in this paper has the potential to provide valuable feedback for automating and improving robotic hand grasping and manipulation.
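Position-invariant descriptors of a pressure map of the kind described can be approximated with central moments taken about the pressure-weighted centroid. The sketch below demonstrates only translation invariance; the chosen moment orders are arbitrary and this is not the paper's 512-feature scheme:

```python
import numpy as np

def central_moments(pressure, orders=((2, 0), (0, 2), (1, 1), (2, 2))):
    """Translation-invariant moments of a 2-D pressure map.

    Moments are taken about the pressure-weighted centroid, so shifting
    the contact patch across the sensor leaves them unchanged; dividing
    by powers of the total pressure follows the standard normalised
    central-moment convention.
    """
    p = np.asarray(pressure, dtype=float)
    total = p.sum()
    ys, xs = np.mgrid[:p.shape[0], :p.shape[1]]
    cy = (ys * p).sum() / total          # pressure-weighted centroid row
    cx = (xs * p).sum() / total          # pressure-weighted centroid column
    feats = []
    for i, j in orders:
        mu = ((ys - cy) ** i * (xs - cx) ** j * p).sum()
        feats.append(mu / total ** (1 + (i + j) / 2))
    return np.array(feats)
```

The same patch placed anywhere on the sensor yields the same feature vector; robustness to occlusion and resolution would require additional machinery.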
Gaining a Sense of Touch: Object Stiffness Estimation Using a Soft Gripper and Neural Networks
2021
Soft grippers are gaining significant attention for the manipulation of elastic objects, where soft and unstructured objects vulnerable to deformation must be handled. The crucial problem is estimating the physical parameters of a squeezed object in order to adjust the manipulation procedure, which poses a significant challenge. Research on physical parameter estimation that applies deep learning to measurements from direct interaction with objects using robotic grippers is scarce. In this work, we propose a trainable system that performs regression of an object's stiffness coefficient from the signals registered during the interaction of the gripper with the object. First, using a physics simulation environment, we performed extensive experiments to validate our approach. Afterwards, we prepared a system that works in a real-world scenario with real data. Our learned system can reliably estimate the stiffness of an object, using the Yale OpenHand soft grip...
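As a much simpler baseline than the learned regressor described above, a stiffness coefficient can be fit by least squares to force-displacement readings under Hooke's law. All values below are synthetic and hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_stiffness(displacement, force):
    # Least-squares fit of Hooke's law F = k * x (no intercept):
    # k = (x . f) / (x . x).
    x = np.asarray(displacement, dtype=float)
    f = np.asarray(force, dtype=float)
    return float(x @ f / (x @ x))

true_k = 250.0                                 # N/m, made-up ground truth
x = np.linspace(0.001, 0.02, 50)               # squeeze depths in metres
f = true_k * x + rng.normal(scale=0.05, size=x.size)  # noisy force readings
k_hat = estimate_stiffness(x, f)
```

A deep model earns its keep when, as in the paper, stiffness must be inferred indirectly from IMU or gripper signals rather than from clean force-displacement pairs.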
Calibration of tactile/force sensors for grasping with the PRISMA Hand II
2021 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 2021
The PRISMA Hand II is a mechanically robust anthropomorphic hand developed at the PRISMA Lab, University of Naples Federico II. The hand is highly underactuated: three motors drive 19 joints via elastic tendons. Thanks to its particular mechanical design, the hand can perform not only adaptive grasps but also in-hand manipulation. Each fingertip integrates a tactile/force sensor based on optoelectronic technology to provide tactile/force feedback during grasping and manipulation, which is particularly useful with deformable objects. The paper briefly describes the mechanical design and sensor technology of the hand and proposes a calibration procedure for the tactile/force sensors. A comparison of different neural network architectures suitable for sensor calibration is shown, and experimental tests are used to choose the optimal tactile sensing suite. Finally, force-regulation experiments demonstrate the effectiveness of the calibrated sensors.
Neuromorphic Vision Based Contact-Level Classification in Robotic Grasping Applications
Sensors
In recent years, robotic sorting has become widely used in industry, driven by both necessity and opportunity. In this paper, a novel neuromorphic vision-based tactile sensing approach for robotic sorting is proposed. The approach has lower latency and power consumption than conventional vision-based tactile sensing techniques. Two Machine Learning (ML) methods, Support Vector Machine (SVM) and Dynamic Time Warping-K Nearest Neighbor (DTW-KNN), are developed to classify material hardness, object size, and grasping force. An Event-Based Object Grasping (EBOG) experimental setup is developed to acquire datasets; 243 experiments are produced to train the proposed classifiers. Based on the classifiers' predictions, objects can be automatically sorted. If the prediction accuracy is below a certain threshold, the gripper re-adjusts and re-grasps until reaching a proper grasp. The proposed ML method achieves good prediction accuracy, which shows t...
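The DTW component of a DTW-KNN classifier can be sketched with the classic dynamic-programming recurrence. This is a minimal illustration on plain 1-D sequences, not the paper's implementation on event-camera data:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic-programming DTW distance."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def knn_dtw_predict(query, train_series, train_labels, k=1):
    # Majority vote among the k training series nearest under DTW.
    dists = [dtw_distance(query, s) for s in train_series]
    nearest = np.argsort(dists)[:k]
    labels = [train_labels[i] for i in nearest]
    return max(set(labels), key=labels.count)
```

DTW tolerates temporal misalignment between grasps, which is why it suits variable-speed contact signals better than plain Euclidean distance.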
Distinguish the Textures of Grasped Objects by Robotic Hand Using Artificial Neural-Network
Engineering and Technology Journal, 2021
The object identification properties available through tactile sensing are valuable for interaction with the environment for both humans and robots, and they are at the core of the sensing used for exploration and for determining properties of objects that are inaccessible to visual perception. Object identification often involves rigid mechanical grippers, tactile information, and intelligent algorithms. This paper proposes a methodology for feature extraction and for discriminating objects of different softness using adaptive robotic grippers equipped with force and angle sensors in each of the four fingers of an underactuated robot hand. An Arduino microcontroller and the Matlab program are integrated to acquire sensor data and to control the gripping action. A neural network is used as an intelligent classifier to distinguish between different object softness levels, using a feature vector built from the force sensor measurements and actuator positions in the time-series response during a single closure grasp. The proposed method's efficiency was validated using experimental paradigms involving three sets of model objects and everyday objects with various shapes, stiffnesses, and sizes.
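One plausible form of the feature vector described, built from summary statistics of the force and actuator-position time series of a single closure grasp, is sketched below. The specific statistics and the 10% contact threshold are illustrative assumptions, not the paper's actual features:

```python
import numpy as np

def grasp_features(force, position):
    """Summary-statistics feature vector from one closure grasp.

    Inputs are the force-sensor and actuator-position time series recorded
    while the fingers close; stiff objects typically show a steeper force
    rise than soft ones.
    """
    f = np.asarray(force, dtype=float)
    p = np.asarray(position, dtype=float)
    # Position at first contact, detected at a hypothetical 10% force threshold.
    first_contact = p[np.argmax(f > 0.1 * f.max())] if f.max() > 0 else p[-1]
    return np.array([
        f.max(),               # peak grasp force
        f.mean(),              # average force over the closure
        np.gradient(f).max(),  # steepest force rise
        p[-1] - p[0],          # total actuator travel
        first_contact,
    ])

# Synthetic grasps: a stiff object loads late and fast, a soft one gradually.
stiff = np.concatenate([np.zeros(5), np.linspace(0.0, 1.0, 5)])
soft = np.linspace(0.0, 1.0, 10)
pos = np.linspace(0.0, 1.0, 10)
```

Such a vector would then be fed to the neural-network classifier; richer features (or the raw series) are also common choices.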