Francesco Chinello - Academia.edu

Papers by Francesco Chinello

Tutorial in “Frontiers in Haptic Technology and Interaction Design: the Challenges, the Technology, the Perspectives”

Technological advancement provides an increasing number and variety of solutions to interact with digital content. However, the complexity of the devices we use to interact with such content grows according to the users' needs as well as the complexity of the target interactions. This also includes all those tools designed to mediate touch interactions with virtual and/or remote environments, i.e., haptic interfaces and rendering techniques. We propose three hours of tutorials to discuss the technology, challenges, and perspective of haptic systems and rendering techniques for immersive human-computer interaction. CCS Concepts: • Human-centered computing → Haptic devices.

The KUKA Control Toolbox: motion control of KUKA robot manipulators with MATLAB

The KUKA Control Toolbox (KCT) is a collection of MATLAB functions developed at the University of Siena for motion control of KUKA robot manipulators. The toolbox, which is compatible with all 6 DOF small and low-payload KUKA robots that use the Eth.RSIXML, runs on a remote computer connected with the KUKA controller via TCP/IP. KCT includes more than 40 functions, spanning operations such as forward and inverse kinematics computation, point-to-point joint and Cartesian control, trajectory generation, graphical display, 3-D animation and diagnostics. Applicative examples show the flexibility of KCT and its easy interfacing with other toolboxes and external devices.
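
KCT itself is a MATLAB toolbox, so the snippet below is not KCT code; it is only a minimal Python sketch of the general pattern the abstract describes, namely a remote computer sending joint set-points to a robot controller over TCP/IP. The host, port, and plain-text message format are invented for illustration and are unrelated to KUKA's actual Eth.RSIXML interface.

```python
import socket

# Hypothetical controller endpoint and message format; KCT/Eth.RSIXML use their own protocol.
CONTROLLER_HOST = "192.168.1.10"
CONTROLLER_PORT = 5000

def send_joint_target(joints_deg):
    """Send a 6-DOF joint target as a comma-separated line (illustrative format only)."""
    assert len(joints_deg) == 6, "expected one value per robot joint"
    message = ",".join(f"{q:.3f}" for q in joints_deg) + "\n"
    with socket.create_connection((CONTROLLER_HOST, CONTROLLER_PORT), timeout=2.0) as sock:
        sock.sendall(message.encode("ascii"))
        reply = sock.recv(1024)  # e.g. an acknowledgement from the controller side
    return reply.decode("ascii").strip()

if __name__ == "__main__":
    print(send_joint_target([0.0, -90.0, 90.0, 0.0, 45.0, 0.0]))
```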

A Novel 3RRS Wearable Fingertip Cutaneous Device for Virtual Interaction

Haptic Interaction, Jul 8, 2017

Effectiveness of Virtual Versus Physical Training: The Case of Assembly Tasks, Trainer's Verbal Assistance, and Task Complexity

IEEE Computer Graphics and Applications

Immersive Training: Outcomes from Small Scale AR/VR Pilot-Studies

2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)

A systematic review of immersive virtual reality for industrial skills training

Behaviour & Information Technology

Combining Wearable Finger Haptics and Augmented Reality: User Evaluation Using an External Camera and the Microsoft HoloLens

IEEE Robotics and Automation Letters

Design and Evaluation of a Wearable Skin Stretch Device for Haptic Guidance

IEEE Robotics and Automation Letters

We present a wearable skin stretch device for the forearm. It is composed of four cylindrical end effectors, evenly distributed around the user's forearm. They can generate independent skin stretch stimuli at the palmar, dorsal, ulnar, and radial sides of the arm. When the four end effectors rotate in the same direction, the wearable device provides cutaneous stimuli about a desired pronation/supination of the forearm. On the other hand, when two opposite end effectors rotate in different directions, the cutaneous device provides cutaneous stimuli about a desired translation of the forearm. To evaluate the effectiveness of our device in providing navigation information, we carried out two experiments of haptic navigation. In the first one, subjects were asked to translate and rotate the forearm toward a target position and orientation, respectively. In the second experiment, subjects were asked to control a 6-DoF robotic manipulator to grasp and lift a target object. Haptic feedback provided by our wearable device improved performance in both experiments with respect to providing no haptic feedback. Moreover, it yielded performance similar to sensory substitution via visual feedback, without overloading the visual channel.
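
As a rough illustration of the actuation logic described above, the sketch below maps a desired forearm cue (a rotation about the arm axis, or a translation along one of two perpendicular directions) to rotation commands for the four end effectors. The motor ordering, sign conventions, and command magnitudes are assumptions made for this example; they are not taken from the paper.

```python
from dataclasses import dataclass

# Assumed ordering of the four end effectors around the forearm.
MOTORS = ("palmar", "dorsal", "ulnar", "radial")

@dataclass
class Cue:
    kind: str       # "rotate" (pronation/supination) or "translate"
    direction: int  # +1 or -1
    axis: str = "palmar-dorsal"  # for translations: which opposite pair should act

def end_effector_commands(cue: Cue, speed: float = 1.0) -> dict:
    """Return a signed rotation command per end effector (sign encodes stretch direction)."""
    if cue.kind == "rotate":
        # All four motors rotate the same way -> cue a pronation/supination of the forearm.
        return {m: cue.direction * speed for m in MOTORS}
    if cue.kind == "translate":
        # Two opposite motors rotate in opposite directions -> cue a translation; others stay idle.
        pair = ("palmar", "dorsal") if cue.axis == "palmar-dorsal" else ("ulnar", "radial")
        cmd = {m: 0.0 for m in MOTORS}
        cmd[pair[0]] = cue.direction * speed
        cmd[pair[1]] = -cue.direction * speed
        return cmd
    raise ValueError(f"unknown cue kind: {cue.kind}")

print(end_effector_commands(Cue(kind="rotate", direction=+1)))
print(end_effector_commands(Cue(kind="translate", direction=-1, axis="ulnar-radial")))
```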

Human-Robot Team Interaction Through Wearable Haptics for Cooperative Manipulation

IEEE Transactions on Haptics

The interaction of robot teams and a single human in teleoperation scenarios is beneficial in cooperative tasks, for example the manipulation of heavy and large objects in remote or dangerous environments. The main control challenge of the interaction is its asymmetry, arising because robot teams have a relatively high number of controllable degrees of freedom compared to the human operator. Therefore, we propose a control scheme that establishes the interaction on spaces of reduced dimensionality, taking into account the low number of human command and feedback signals imposed by haptic devices. We evaluate the suitability of wearable haptic fingertip devices for multi-contact teleoperation in a user study. The results show that the proposed control approach is appropriate for human-robot team interaction and that the wearable haptic fingertip devices provide suitable assistance in cooperative manipulation tasks.
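
The abstract does not give the controller's equations, but the core idea, mapping a few human command signals onto the team's many degrees of freedom and mapping team measurements back onto a few haptic channels through fixed low-rank maps, can be sketched as plain linear algebra. The matrix sizes and values below are invented purely for illustration and are not the authors' control scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

n_team_dofs = 18   # e.g. three 6-DOF manipulators (assumed team size)
n_cmd = 3          # few human command signals (e.g. desired object translation)
n_feedback = 3     # few haptic channels (e.g. one fingertip force cue per finger)

# Fixed maps between the human's reduced space and the team's full space (placeholder values).
B = rng.standard_normal((n_team_dofs, n_cmd))        # command map: reduced space -> team DOFs
C = rng.standard_normal((n_feedback, n_team_dofs))   # feedback map: team measurements -> haptic channels

human_command = np.array([0.02, 0.0, -0.01])         # small desired object displacement
team_reference = B @ human_command                   # reference motion distributed over the team

team_measurements = rng.standard_normal(n_team_dofs) * 0.1
haptic_feedback = C @ team_measurements              # what the fingertip devices would render

print(team_reference.shape, haptic_feedback)
```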

A modular wearable finger interface for cutaneous and kinesthetic interaction: control and evaluation

IEEE Transactions on Industrial Electronics

Linear Integration of Tactile and Non-tactile Inputs Mediates Estimation of Fingertip Relative Position

Frontiers in Neuroscience

While skin, joint, and muscle receptors alone provide lower-level information about individual variables (e.g., exerted limb force and limb displacement), the distance between limb endpoints (i.e., relative position) has to be extracted from higher-level integration of somatosensory and motor signals. In particular, estimation of fingertip relative position likely involves more complex sensorimotor transformations than those underlying hand or arm position sense: the brain has to estimate where each fingertip is relative to the hand and where fingertips are relative to each other. It has been demonstrated that during grasping, feedback of digit position drives rapid adjustments of finger force control. However, it has been shown that estimation of fingertips' relative position can be biased by digit forces. These findings raise the question of how the brain combines concurrent tactile (i.e., cutaneous mechanoreceptor afferents induced by skin pressure and stretch) and non-tactile (i.e., both descending motor commands and joint/muscle receptor signals associated with muscle contraction) digit force-related inputs for fingertip distance estimation. Here we addressed this question by quantifying the contribution of tactile and non-tactile force-related inputs to the estimation of fingertip relative position. We asked subjects to match fingertip vertical distance relying only on either tactile or non-tactile inputs from the thumb and index fingertip, and compared their performance with the condition where both types of inputs were combined. We found that (a) the bias in the estimation of fingertip distance persisted when tactile inputs and non-tactile force-related signals were presented in isolation; (b) tactile signals contributed the most to the estimation of fingertip distance; (c) linear summation of the matching errors relying only on either tactile or non-tactile inputs was comparable to the matching error when both inputs were simultaneously available. These findings reveal a greater role of tactile signals for sensing fingertip distance and suggest a linear integration mechanism with non-tactile inputs for the estimation of fingertip relative position.
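
In notation chosen here (not taken from the paper), finding (c) amounts to an approximately additive relation between the matching errors measured in the single-cue and combined-cue conditions:

$$ e_{T+NT} \;\approx\; e_{T} + e_{NT} $$

where $e_T$, $e_{NT}$, and $e_{T+NT}$ denote the matching errors with only tactile inputs, only non-tactile inputs, and both available, respectively; this additivity is what the authors interpret as a linear integration mechanism.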

A Three Revolute-Revolute-Spherical wearable fingertip cutaneous device for stiffness rendering

IEEE Transactions on Haptics

We present a novel three Revolute-Revolute-Spherical (3RRS) wearable fingertip device for the rendering of stiffness information. It is composed of a static upper body and a mobile end-effector. The upper body is located on the nail side of the finger, supporting three small servo motors, and the mobile end-effector is in contact with the finger pulp. The two parts are connected by three articulated legs, actuated by the motors. The end-effector can move toward the user's fingertip and rotate to simulate contacts with arbitrarily oriented surfaces. Moreover, a vibrotactile motor placed below the end-effector conveys vibrations to the fingertip. The proposed device weighs 25 g and measures 35×50×48 mm. To test the effectiveness of our wearable haptic device and its level of wearability, we carried out two experiments, enrolling thirty human subjects in total. The first experiment tested the capability of our device in differentiating stiffness information, while the second one focused on evaluating its applicability in an immersive virtual reality scenario. Results showed the effectiveness of the proposed wearable solution, with a JND for stiffness of 208.5 ± 17.2 N/m. Moreover, all subjects preferred the virtual interaction experience when provided with wearable cutaneous feedback, even if results also showed that subjects still found our device somewhat difficult to use.

Evaluation of wearable haptic systems for the fingers in Augmented Reality applications

IEEE Transactions on Haptics, Jan 5, 2017

Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even if AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using real chalk. The haptic devices provided the interaction forces between the...

Optimization-Based Wearable Tactile Rendering

IEEE Transactions on Haptics, Jan 20, 2016

Novel wearable tactile interfaces offer the possibility to simulate tactile interactions with virtual environments directly on our skin. However, unlike kinesthetic interfaces, for which haptic rendering is a well-explored problem, they pose new questions about the formulation of the rendering problem. In this work, we propose a formulation of tactile rendering as an optimization problem, which is general for a large family of tactile interfaces. Based on an accurate simulation of contact between a finger model and the virtual environment, we pose tactile rendering as the optimization of the device configuration, such that the contact surface between the device and the actual finger matches as closely as possible the contact surface in the virtual environment. We describe the optimization formulation in general terms, and we also demonstrate its implementation on a thimble-like wearable device. We validate the tactile rendering formulation by analyzing its force error, and we show that it...
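
As a minimal sketch of the kind of optimization the abstract describes (not the authors' actual formulation), one can search over a simplified device configuration, here a platform height and two tilt parameters, so that the device-induced contact deformation best matches a target deformation computed from the virtual environment. The contact model, cost function, and target field below are placeholders invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Sample points on the fingertip (mm) where contact deformation is compared; an assumed grid.
pts = np.stack(np.meshgrid(np.linspace(-5, 5, 9), np.linspace(-5, 5, 9)), axis=-1).reshape(-1, 2)

def device_deformation(cfg, points):
    """Toy contact model: deformation induced by a flat platform with height d and tilts (ax, ay)."""
    d, ax, ay = cfg
    return d + ax * points[:, 0] + ay * points[:, 1]

# Target deformation field that a contact simulation with the virtual object might produce (made up).
target = 1.0 + 0.05 * pts[:, 0] - 0.02 * pts[:, 1]

def cost(cfg):
    # Mismatch between device-induced and simulated contact deformation over the fingertip.
    return np.sum((device_deformation(cfg, pts) - target) ** 2)

res = minimize(cost, x0=np.zeros(3), method="Nelder-Mead")
print("optimal device configuration (d, ax, ay):", res.x)
```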

Design and development of a 3RRS wearable fingertip cutaneous device

2015 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), 2015

Soft finger tactile rendering for wearable haptics

2015 IEEE World Haptics Conference (WHC), 2015

RACT: A Remote Lab for Robotics Experiments

Proceedings of the 17th IFAC World Congress, 2008

The "Robotics & Automatic Control Telelab" (RACT) is a remote laboratory on robotics developed at... more The "Robotics & Automatic Control Telelab" (RACT) is a remote laboratory on robotics developed at University of Siena, which extends the field of application of the "Automatic Control Telelab" (ACT). This extension consists of adding experiments on a remote robot manipulator. RACT is mainly intended for educational use, and its Matlab-based architecture allows students to easily put in practice their theoretical knowledge on robotics. The first implementation of RACT consists of a remote experiment on inverse kinematics and of an experiment on visual servoing. Experiments on visual servoing represent the most advanced feature of the remote lab and work is in progress to add more experiments of this type.

Motion Control of Robot Manipulators with MATLAB

An object-based mapping algorithm to control wearable robotic extra-fingers

2014 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2014

One of the new targets of wearable robots is not to enhance lifting strength far above human capability by wearing a bulky robot, but to support human capability within its range by wearing lightweight and compact robots. A new approach regarding robotic extra-fingers is presented here. In particular, an object-based mapping algorithm is proposed to control the robotic extra-fingers by interpreting the whole or a part of the hand motion in grasping and manipulation tasks. As a case study, the model and control of an additional robotic finger is presented. The robotic finger has been placed on the wrist, opposite to the hand palm. This solution enlarges the hand workspace, increasing the grasp capability of the user. The proposed mapping algorithm does not require the human operator to activate explicit commands. Rather, the motion of the extra-fingers is connected to the human hand so that the user can perceive the robotic fingers as an extension of their own body.
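
A concrete (and deliberately simplified) way to read "object-based mapping" is sketched below: a virtual object frame is derived from tracked hand points, and the extra finger's reference point is defined relative to that frame, so it follows the hand's grasping motion without explicit commands. The tracked points, offsets, and frame construction are assumptions for the example, not the algorithm from the paper.

```python
import numpy as np

def object_frame(thumb_tip, index_tip, palm):
    """Build a frame from three tracked hand points: origin between the fingertips,
    x toward the index finger, z away from the palm (a simplified virtual-object frame)."""
    origin = 0.5 * (thumb_tip + index_tip)
    x = index_tip - thumb_tip
    x /= np.linalg.norm(x)
    z = origin - palm
    z -= (z @ x) * x            # make z orthogonal to x
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.column_stack([x, y, z])
    return R, origin

def extra_finger_target(thumb_tip, index_tip, palm, offset=np.array([0.0, -0.08, 0.02])):
    """Desired extra-finger contact point, expressed as a fixed offset in the object frame (assumed)."""
    R, origin = object_frame(thumb_tip, index_tip, palm)
    return origin + R @ offset

# Example: as the tracked hand points move, the extra-finger target follows automatically.
print(extra_finger_target(np.array([0.02, 0.0, 0.0]),
                          np.array([-0.02, 0.0, 0.0]),
                          np.array([0.0, -0.05, -0.03])))
```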

Vibrotactile haptic feedback for human-robot interaction in leader-follower tasks

Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments - PETRA '12, 2012
