Inertial parameter estimation of floating base humanoid systems using partial force sensing

Fast Whole-Body Motion Control of Humanoid Robots with Inertia Constraints

We introduce a new analytical method for generating whole-body motions for humanoid robots that approximate a desired Composite Rigid Body (CRB) inertia. Our approach uses a reduced five-mass model, in which four of the masses are attributed to the limbs and one to the trunk. This compact formulation admits an analytical solution that combines the kinematics with the mass distribution and inertial properties of a humanoid robot. The Cartesian positions of the masses are then converted directly into joint angles through simple geometric relations. Motions are obtained as the time evolution of poses generated from the desired foot positions and CRB inertia properties. As a result, we achieve computation times on the order of tens of microseconds, which makes the method well suited for applications with limited computational resources, or where those resources are better spent on higher-layer tasks such as model predictive control. The approach is evaluated by performing a dynamic kicking motion with an igus Humanoid Open Platform robot.
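As a rough illustration of the reduced-model idea (not the paper's actual formulation), the sketch below aggregates five point masses, a trunk and four limbs, into a CRB inertia about their common center of mass; all mass values and positions are made-up placeholders.

```python
# Minimal sketch: aggregating a reduced five-mass model (trunk + four limbs,
# treated as point masses) into a Composite Rigid Body (CRB) inertia about
# the common center of mass.  Numbers are illustrative placeholders only.
import numpy as np

def crb_inertia(masses, positions):
    """Return total mass, CoM, and CRB inertia of a set of point masses."""
    m = np.asarray(masses, dtype=float)      # (5,)
    p = np.asarray(positions, dtype=float)   # (5, 3)
    m_tot = m.sum()
    com = (m[:, None] * p).sum(axis=0) / m_tot
    I = np.zeros((3, 3))
    for mi, pi in zip(m, p):
        r = pi - com                         # offset from the CoM
        # point-mass contribution: m * (|r|^2 * E - r r^T)
        I += mi * (np.dot(r, r) * np.eye(3) - np.outer(r, r))
    return m_tot, com, I

# trunk + left/right arm + left/right leg (placeholder values)
masses = [3.0, 0.5, 0.5, 1.0, 1.0]
positions = [[0, 0, 0.35], [0.15, 0.1, 0.3], [0.15, -0.1, 0.3],
             [0.05, 0.08, 0.1], [0.05, -0.08, 0.1]]
m_tot, com, I_crb = crb_inertia(masses, positions)
print(m_tot, com.round(3), I_crb.round(4), sep="\n")
```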

Humanoid and Human Inertia Parameter Identification Using Hierarchical Optimization

IEEE Transactions on Robotics, 2016

We propose a method for estimating the inertial parameters of humanoid and human links. Our approach formulates the problem as a hierarchical quadratic program by exploiting the fact that rigid-body dynamics are linear in the inertial parameters. To assess our algorithm, we conducted experiments with a humanoid robot and a human subject, comparing ground reaction forces and moments estimated from force measurements with those computed from the identified inertial parameters and movement information. Our method accurately reconstructs the ground reaction forces and moments. Moreover, it correctly estimates the masses of the robot's links and reliably detects additional masses placed on the human subject during the experiments.
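The key property exploited here is that rigid-body dynamics are linear in the inertial parameters, tau = Y(q, dq, ddq) * pi. The toy sketch below identifies the two parameters of a single pendulum-like link with ordinary least squares; the paper itself solves a hierarchical quadratic program over whole-body data, so this only illustrates the linearity, not the method.

```python
# Toy illustration of the linearity of rigid-body dynamics in the inertial
# parameters: tau = Y(q, dq, ddq) @ pi.  A single pendulum-like link is
# identified with ordinary least squares on synthetic data.
import numpy as np

g = 9.81
true_pi = np.array([0.12, 0.40])          # [m*l^2, m*l] (ground truth for the toy)

rng = np.random.default_rng(0)
q = rng.uniform(-1.0, 1.0, 500)           # joint positions
ddq = rng.uniform(-5.0, 5.0, 500)         # joint accelerations

# Regressor rows: tau = (m*l^2) * ddq + (m*l) * g * cos(q)
Y = np.column_stack([ddq, g * np.cos(q)])
tau = Y @ true_pi + 0.01 * rng.standard_normal(500)   # noisy "measurements"

pi_hat, *_ = np.linalg.lstsq(Y, tau, rcond=None)
print("identified parameters:", pi_hat)   # close to [0.12, 0.40]
```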

Inertial Parameter Identification Including Friction and Motor Dynamics

2013

Identification of inertial parameters is fundamental for the implementation of torque-based control in humanoids. At the same time, good models of friction and actuator dynamics are critical for the low-level control of joint torques. We propose a novel method to identify inertial, friction, and motor parameters in a single procedure. The identification exploits measurements of the DC motor PWM signals and of a 6-axis force/torque sensor mounted inside the kinematic chain, with the regression performed by partial least squares (PLS). We identified the inertial, friction, and motor parameters of the right arm of the iCub humanoid robot and verified that the identified model accurately predicts the force/torque sensor measurements and the motor voltages. Moreover, we compared the identified parameters against the CAD parameters in predicting the force/torque sensor measurements. Finally, we showed that the estimated model can effectively detect external contacts, comparing it against tactile-based contact detection. The presented approach offers advantages over other state-of-the-art methods thanks to its completeness (it identifies inertial, friction, and motor parameters together) and simplicity (a single data collection with no special requirements).
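As a small, hedged illustration of the regression step, the snippet below applies scikit-learn's PLSRegression to synthetic data with strongly correlated regressor columns, the situation PLS is typically chosen for; it does not reproduce the paper's actual regressor built from PWM and F/T measurements.

```python
# Illustrative use of partial least-squares (PLS) regression on synthetic
# data whose regressor columns are strongly correlated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n = 400
latent = rng.standard_normal((n, 2))
# Two nearly collinear columns plus an independent one.
X = np.column_stack([latent[:, 0],
                     latent[:, 0] + 0.01 * rng.standard_normal(n),
                     latent[:, 1]])
true_w = np.array([1.5, -0.5, 0.8])
y = X @ true_w + 0.05 * rng.standard_normal(n)

pls = PLSRegression(n_components=2)       # fewer latent components than columns
pls.fit(X, y)
rmse = float(np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2)))
print("prediction RMSE:", rmse)
```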

A Model-Free Approach for Accurate Joint Motion Control in Humanoid Locomotion

International Journal of Humanoid Robotics, 2011

A new model-free approach to precisely control humanoid robot joints is presented in this article. An online input–output identification procedure makes it possible to compensate for neglected or uncertain dynamics, such as nonlinear transmission and compliance effects on the one hand and network transmission delays on the other. Robustness to parameter variations is analyzed and compared with that of other advanced PID-based controllers. Simulations show that this technique not only achieves good tracking quality but also provides very robust closed-loop behavior. Furthermore, a locomotion task is tested in a complete humanoid simulator to highlight the suitability of this control approach for such complex systems.
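A minimal sketch of the kind of online input–output identification such an approach relies on: a recursive least-squares (RLS) fit of a simple first-order ARX joint model whose estimate could then be used to compensate unmodelled dynamics. The model structure, gains, and numbers are illustrative assumptions, not the paper's controller.

```python
# Recursive least squares (RLS) fitting y[k] = a*y[k-1] + b*u[k-1] online.
import numpy as np

a_true, b_true = 0.9, 0.1                  # "unknown" plant (toy example)
theta = np.zeros(2)                        # RLS estimate of [a, b]
P = 1e3 * np.eye(2)                        # parameter covariance
lam = 0.99                                 # forgetting factor

rng = np.random.default_rng(2)
y_prev, u_prev = 0.0, 0.0
for k in range(500):
    u = np.sin(0.05 * k) + 0.1 * rng.standard_normal()
    y = a_true * y_prev + b_true * u_prev + 0.001 * rng.standard_normal()

    phi = np.array([y_prev, u_prev])       # regressor
    K = P @ phi / (lam + phi @ P @ phi)    # gain
    theta += K * (y - phi @ theta)         # parameter update
    P = (P - np.outer(K, phi @ P)) / lam   # covariance update

    y_prev, u_prev = y, u

print("estimated [a, b]:", theta)          # close to [0.9, 0.1]
```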

Robust Nonlinear State Estimation for Humanoid Robots

2020

Center of Mass (CoM) estimation plays a crucial role in legged locomotion: most walking pattern generators and real-time gait stabilizers assume that the CoM position and velocity are available for feedback. In this thesis we present one of the first 3D-CoM state estimators for humanoid robot walking. The proposed estimation scheme effectively fuses joint encoder, inertial, and foot pressure measurements with an Extended Kalman Filter (EKF) to accurately estimate the 3D-CoM position, velocity, and external forces acting on the CoM. Furthermore, it directly accounts for uneven terrain and for the body's angular momentum rate, and thus effectively couples the frontal-plane with the lateral-plane dynamics, without relying on foot Force/Torque (F/T) sensing.

Since it is common practice to transform the measurements to a world frame of reference and estimate the CoM with respect to that frame, the robot's base and support-foot poses must be co-estimated. To this end, we extend a floating-mass estimator that is well established in the literature to account for the support-foot dynamics, and fuse kinematic-inertial measurements with an Error State Kalman Filter (ESKF) to properly handle the overparametrization of rotations. This yields a cascaded state estimation scheme, consisting of a base and a CoM estimator, coined State Estimation RObot Walking (SEROW). Additionally, we employ Visual Odometry (VO) and/or LIDAR Odometry (LO) measurements to correct the kinematic drift caused by slippage during walking. Such measurements, however, suffer from outliers in dynamic environments, since it is frequently assumed that only the robot is in motion and the surrounding world is static. We therefore introduce the Robust Gaussian ESKF (RGESKF), which automatically detects and rejects outliers without relying on prior knowledge of the measurement distributions or on finely tuned thresholds. SEROW is thus robustified and becomes suitable for dynamic human environments. To support further research, SEROW is released to the robotics community as an open-source ROS/C++ package.

Contemporary control and state estimation schemes readily assume that the feet contact status is known a priori, yet contact detection remains an important and largely unexplored topic in humanoid robotics research. In this thesis we address a broader question: which gait phase is the robot currently in? We propose a holistic framework based on unsupervised learning from proprioceptive sensing that addresses this problem accurately and efficiently. More specifically, we robustly detect one of three gait phases, namely Left Single Support (LSS), Double Support (DS), and Right Single Support (RSS), using joint encoder, IMU, and F/T measurements. Dimensionality reduction with Principal Component Analysis (PCA) or autoencoders is first performed to extract useful features, obtain a compact representation, and reduce noise. Clustering is then performed on the low-dimensional latent space with Gaussian Mixture Models (GMMs), and three dense clusters corresponding to the gait phases are obtained. Interestingly, the gait-phase dynamics are demonstrated to be low-dimensional, a further indication that locomotion is a low-dimensional skill.

Accordingly, given that the proposed framework uses measurements from sensors commonly available on today's humanoids, we offer the Gait-phase Estimation Module (GEM), an open-source ROS/Python implementation, to the robotics community. SEROW and GEM have been quantitatively and qualitatively assessed in terms of accuracy and efficiency, both in simulation and under real-world conditions. Initially, a simulated robot in MATLAB and NASA's Valkyrie humanoid in ROS/Gazebo were employed to establish the proposed schemes on uneven/rough-terrain gaits. Subsequently, the schemes were integrated on a) the small-size NAO humanoid robot v4.0 and b) the adult-size WALK-MAN v2.0 for experimental validation. On NAO, SEROW provided the feedback needed for motion planning and real-time gait stabilization, achieving omnidirectional locomotion even on outdoor/uneven terrain; it was also used in footstep planning and in Visual SLAM with the same robot. On WALK-MAN v2.0, SEROW was executed onboard with kinematic-inertial and F/T data to provide base and CoM feedback in real time, VO was additionally considered to correct kinematic drift while walking and to facilitate footstep planning, and GEM was employed to estimate the gait phase during WALK-MAN's dynamic gaits. In summary, a robust nonlinear state estimator is proposed for humanoid robot walking; the scheme can readily be extended to other types of legged robots, such as quadrupeds, since they share the same fundamental principles.
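A minimal sketch of the GEM pipeline described above, assuming synthetic stand-ins for the proprioceptive features: PCA for dimensionality reduction, followed by a three-component Gaussian Mixture Model whose clusters play the role of LSS, DS, and RSS.

```python
# PCA + GMM clustering on synthetic stand-ins for joint-encoder, IMU, and
# F/T features; the three clusters correspond to the three gait phases.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Three synthetic "gait phases", each a cluster in a 12-D feature space.
centers = rng.standard_normal((3, 12)) * 3.0
X = np.vstack([c + 0.3 * rng.standard_normal((300, 12)) for c in centers])

latent = PCA(n_components=2).fit_transform(X)      # dimensionality reduction
gmm = GaussianMixture(n_components=3, random_state=0).fit(latent)
labels = gmm.predict(latent)                       # one of three gait phases
print("samples per cluster:", np.bincount(labels))
```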

State estimation for force-controlled humanoid balance using simple models in the presence of modeling error

2011 IEEE International Conference on Robotics and Automation, 2011

This paper considers the design of state estimators for dynamic balancing systems using a Linear Inverted Pendulum model with unknown modeling errors such as a center of mass measurement offset or an external force. A variety of process and output models are constructed and compared. For a system containing modeling error, it is shown that a naive estimator (one that doesn't account for this error) will result in inaccurate state estimates. These state estimators are evaluated on a force-controlled humanoid robot for a sinusoidal swaying task and a forward push recovery task.
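One common way to make an estimator aware of such errors, sketched below under simplifying assumptions, is to augment the Linear Inverted Pendulum state with a constant disturbance term (standing in for a CoM offset or an external force) and estimate it with a Kalman filter; the matrices and noise levels here are illustrative, not those of the paper.

```python
# Kalman filter on LIP dynamics with an augmented constant-disturbance state,
# so the modeling error is estimated rather than ignored.
import numpy as np

dt = 0.005
w2 = 9.81 / 0.8                              # omega^2 = g / z_c
omega = np.sqrt(w2)

# state: [CoM position x, CoM velocity xd, disturbance d]
A = np.array([[1.0,      dt,  0.0],
              [w2 * dt, 1.0,   dt],
              [0.0,     0.0,  1.0]])
B = np.array([0.0, -w2 * dt, 0.0])           # input: (measured) ZMP p
H = np.array([[1.0, 0.0, 0.0]])              # only CoM position is measured
Q = np.diag([1e-8, 1e-6, 1e-7])
R = np.array([[1e-4]])

rng = np.random.default_rng(4)
x_true = np.array([0.02, 0.0, 0.05])         # true disturbance d = 0.05
x_hat = np.zeros(3)
P = np.eye(3)

for _ in range(4000):
    # simple stabilizing ZMP input so the simulated LIP stays bounded
    p = 1.2 * (x_true[0] + x_true[1] / omega)
    x_true = A @ x_true + B * p
    z = H @ x_true + 0.01 * rng.standard_normal(1)

    # Kalman predict / update with the known input p
    x_hat = A @ x_hat + B * p
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_hat = x_hat + K @ (z - H @ x_hat)
    P = (np.eye(3) - K @ H) @ P

print("estimated disturbance:", x_hat[2])    # should approach 0.05
```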

A whole-body control framework for humanoids operating in human environments

Proceedings of the 2006 IEEE International Conference on Robotics and Automation (ICRA), 2006

Tomorrow's humanoids will operate in human environments, where efficient manipulation and locomotion skills and safe contact interactions will be critical design factors. We report here our recent efforts on these issues, realized as a whole-body control framework. The framework integrates task-oriented dynamic control and control prioritization [14], allowing multiple task primitives to be controlled while complying with physical and movement-related constraints. Prioritization establishes a hierarchy between control spaces, assigning top priority to constraint-handling tasks, projecting operational tasks into the null space of the constraints, and controlling the posture within the residual redundancy. This hierarchy is integrated directly at the kinematic level, allowing behavior feasibility to be monitored at runtime. In addition, prioritization allows us to characterize the dynamic behavior of the individual control primitives subject to the constraints, and to synthesize operational-space controllers at multiple levels. To complete the framework, we have developed free-floating models of the humanoid and incorporated the associated dynamics and the effects of the resulting support contacts into the control hierarchy. As part of a long-term collaboration with Honda, we are currently implementing this framework on the humanoid robot Asimo.
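A hedged, kinematic-level sketch of the prioritization idea: the secondary task is resolved only in the null space of the higher-priority (constraint) task, so it cannot disturb it. The Jacobians below are random stand-ins for a redundant robot, not the framework's actual task set.

```python
# Null-space task prioritization at the differential-kinematics level.
import numpy as np

rng = np.random.default_rng(5)
n = 7                                        # redundant joint space
J1 = rng.standard_normal((2, n))             # priority-1 (constraint) task
J2 = rng.standard_normal((3, n))             # priority-2 (operational) task
dx1 = np.array([0.1, -0.05])
dx2 = np.array([0.2, 0.0, 0.1])

pinv = np.linalg.pinv
N1 = np.eye(n) - pinv(J1) @ J1               # null-space projector of task 1
dq = pinv(J1) @ dx1                          # satisfy task 1 exactly
dq += pinv(J2 @ N1) @ (dx2 - J2 @ dq)        # task 2 in the remaining freedom

print("task-1 error:", np.linalg.norm(J1 @ dq - dx1))   # ~0
print("task-2 error:", np.linalg.norm(J2 @ dq - dx2))   # small, limited by N1
```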

Whole body posture controller based on inertial forces

2006 6th IEEE-RAS International Conference on Humanoid Robots, 2006

The goal of our research is to treat different humanoid locomotion modes (walking, running, etc.) with a unified approach. As a step towards this goal, we present a whole-body posture controller that imposes the generalized inertial forces of the robot as well as the position and orientation of the extremities that are not in contact with the ground. The method, based on the Lagrangian dynamics equations, makes it possible to handle the angular momentum and ZMP stability. To validate the approach, we carried out planar-motion experiments on the humanoid robot HRP-2.
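For reference, the flat-ground ZMP that such a controller regulates can be written in terms of the generalized inertial forces (CoM acceleration and angular-momentum rate about the CoM); the snippet below evaluates that standard formula with illustrative numbers and is not the paper's controller.

```python
# ZMP on flat ground from CoM acceleration and angular-momentum rate.
import numpy as np

g = 9.81
m = 54.0                                     # robot mass (illustrative)
c = np.array([0.02, 0.00, 0.60])             # CoM position
cdd = np.array([0.50, 0.10, 0.00])           # CoM acceleration
Ldot = np.array([0.0, 2.0, 0.0])             # angular-momentum rate about CoM

fz = m * (cdd[2] + g)                        # vertical ground reaction force
zmp_x = (m * c[0] * (cdd[2] + g) - m * c[2] * cdd[0] - Ldot[1]) / fz
zmp_y = (m * c[1] * (cdd[2] + g) - m * c[2] * cdd[1] + Ldot[0]) / fz
print("ZMP (flat ground):", zmp_x, zmp_y)
```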