Guest Editorial Behavior Understanding and Developmental Robotics

Automatic Understanding of Human Behaviour for Developmental Robotics Using Machine Learning

2023

With the rapid technological growth in today's world, we begin to see the necessity for automated resources such as robots in our day-to-day lives. It therefore becomes important to focus our attention on easing the interaction between humans and robots, essentially by finding ways to make robots adapt to human society for better integration. For example, in areas where the elderly population far surpasses that of the youth, considerable time and attention is needed to care for the elderly, which becomes a burden; with the help of an intelligent, self-reliant assistant (such as a robot) capable of understanding and looking after the elderly, caregivers can have more time for other activities. There are several possible ways of improving the adaptiveness of robots in human society. One such way is to improve the ability of robots to understand human behaviour, because, as with humans, once you know and understand whomever you are interacting with, you can more easily decide how to respond. In this research, we analyse the possibilities for robots to properly understand human behaviour through the use of machine learning.
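As a rough illustration of what such machine-learning-based behaviour understanding could look like in practice, the sketch below trains a classifier to map motion features to behaviour labels. The feature dimensions, labels, and synthetic data are purely illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: classifying human behaviour from motion features.
# Feature choices and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy dataset: each row is a feature vector from a short observation
# window (e.g., joint velocities, gaze direction, proximity to the robot).
X = rng.normal(size=(300, 6))
y = rng.integers(0, 3, size=300)  # 0=resting, 1=walking, 2=reaching (assumed)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With real pose or proximity features in place of the random data, the same pipeline yields behaviour labels a robot could condition its responses on.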

Human Behavior Understanding for Robotics

2012

Human behavior is complex, but structured along individual and social lines. Robotic systems interacting with people in uncontrolled environments need capabilities to correctly interpret, predict, and respond to human behaviors. This paper discusses the scientific, technological, and application challenges that arise from the mutual interaction of robotics and computational human behavior understanding. We provide a short survey of the field as a contextual framework and describe the most recent research in this area.

Going into the wild in child–robot interaction studies: issues in social robotic development

Intelligent Service Robotics, 2008

As robots move into more human-centric environments, we require methods to develop robots that can naturally interact with humans. Doing so requires testing in the real world and addressing multidisciplinary challenges. Our research is focused on child-robot interaction, which includes very young children, for example toddlers, and children diagnosed with autism. More traditional forms of human-robot communication, such as speech or gesture recognition, may not be appropriate for these users, whereas touch may provide a more natural and appropriate means of communication in such instances. In this paper, we present our findings on these topics obtained from a project involving a spherical robot that acquires information regarding natural touch by analysing sensory patterns over time to characterize the information. More specifically, from this project we have derived important factors for future consideration; we describe our iterative experimental methodology of testing in and out of the 'wild' (lab-based and real-world), and outline discoveries that were made by doing so.
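To make the idea of characterizing touch from sensory patterns over time more concrete, here is a minimal sketch that extracts temporal features from a pressure signal and applies rough rule-based labels. The feature set, thresholds, and touch categories are assumptions for illustration; the project's actual sensing and classification pipeline is not described at this level of detail.

```python
# Sketch of characterizing touch from temporal sensor patterns;
# thresholds and touch categories are illustrative assumptions.
import numpy as np

def touch_features(pressure: np.ndarray, dt: float = 0.01) -> dict:
    """Summarize a pressure time series from one touch event."""
    return {
        "duration_s": len(pressure) * dt,
        "peak": float(pressure.max()),
        "variability": float(pressure.std()),
    }

def classify_touch(f: dict) -> str:
    """Very rough rule-based labelling (illustrative only)."""
    if f["duration_s"] < 0.15 and f["peak"] > 0.8:
        return "poke"
    if f["duration_s"] > 0.5 and f["variability"] < 0.1:
        return "stroke"
    return "unknown"

# A short, sharp pressure spike should be labelled as a poke.
poke = np.concatenate([np.linspace(0, 1, 5), np.linspace(1, 0, 5)])
print(classify_touch(touch_features(poke)))  # -> "poke"
```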

Infant-like Social Interactions between a Robot and a Human Caregiver

Adaptive Behavior, 2000

This paper presents an autonomous robot designed to interact socially with human "parents". A human infant's emotions and drives play an important role in generating meaningful interactions with the caretaker, regulating these interactions to maintain an environment suitable for the learning process, and assisting the caretaker in satisfying the infant's drives. For our purposes, the ability to regulate how intensely the caretaker engages the robot is vital to successful learning in a social context.
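A minimal sketch of the kind of homeostatic drive regulation the paper describes, where the robot's expressive behaviour invites the caregiver to raise or lower the intensity of engagement. The constants, thresholds, and display names are illustrative assumptions, not the paper's actual model.

```python
# Sketch of a homeostatic drive regulating interaction intensity;
# constants and thresholds are illustrative assumptions.
def update_drive(drive: float, stimulation: float,
                 decay: float = 0.1) -> float:
    """Without stimulation the drive drifts toward under-stimulation;
    caregiver stimulation pushes it back toward (and past) the set point 0."""
    return drive - decay + stimulation

def regulate(drive: float) -> str:
    """Choose an expressive display asking the caregiver to adjust."""
    if drive < -0.3:
        return "appear bored -> invite more engagement"
    if drive > 0.3:
        return "appear overwhelmed -> ask caregiver to ease off"
    return "engaged -> maintain current interaction"

drive = 0.0
for stim in [0.0, 0.0, 0.0, 0.0, 0.3, 0.3, 0.5, 0.5]:
    drive = update_drive(drive, stim)
    print(f"drive={drive:+.2f}: {regulate(drive)}")
```

Running the loop shows all three regimes: the drive drifts until the robot appears bored, recovers to an engaged band under moderate stimulation, and signals overload when stimulation is too intense.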

What should a robot learn from an infant? Mechanisms of action interpretation and observational learning in infancy

Connection Science, 2003

The paper provides a summary of our recent research on preverbal infants (using violation-of-expectation and observational learning paradigms) demonstrating that one-year-olds interpret and draw systematic inferences about others' goal-directed actions, and can rely on such inferences when imitating others' actions or emulating their goals. To account for these findings, it is proposed that one-year-olds apply a non-mentalistic action interpretational system, the 'teleological stance', which represents actions by relating relevant aspects of reality (action, goal-state, and situational constraints) through the principle of rational action, which assumes that actions function to realize goal-states by the most efficient means available in the actor's situation. The relevance of these research findings and the proposed theoretical model for realizing the goal of epigenetic robotics of building a 'socially relevant' humanoid robot is discussed.
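The principle of rational action lends itself to a small worked example. The sketch below encodes the kind of setup the paradigm uses: an observer expects the most efficient means to a goal given situational constraints and registers a violation when a costly action lacks justification. The scenario encoding and cost model are illustrative assumptions.

```python
# Sketch of the 'principle of rational action' as an efficiency check;
# the scenario encoding and cost model are illustrative assumptions.
def expected_action(obstacle_present: bool) -> str:
    """The efficient means to the goal given situational constraints."""
    return "jump over obstacle" if obstacle_present else "go straight"

def violates_expectation(observed_action: str, obstacle_present: bool) -> bool:
    """A teleological observer is 'surprised' when the agent pays a cost
    (jumping) that the situation does not justify."""
    return observed_action != expected_action(obstacle_present)

# Habituation phase: the agent jumps over a barrier to reach its goal.
print(violates_expectation("jump over obstacle", obstacle_present=True))   # False
# Test phase: barrier removed, agent still jumps -> inefficient, violated.
print(violates_expectation("jump over obstacle", obstacle_present=False))  # True
```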

Human Infant Looking and Learning Behavior in Robots

This paper presents a biologically realistic spiking neural model of the auditory, visual, and looking-control systems of a young human infant. Several behaviors that have been observed in young human infants arise when the model is used to control the head movements of a NAO humanoid robot in the same situations. It demonstrates visual habituation as measured by looking time. It also demonstrates multimodal habituation to synchronous auditory-visual stimuli during real-time interaction with a human who is performing a "showing-and-naming" behavior. In-depth analysis of the evolution of the model affords greater understanding of its internal neural dynamics and how they interact with the external world, which includes the robot's body and the human. Since the model is biologically based, it increases our understanding of how and why human infants behave as they do, and of what can be learned from them to build robots that interact better.
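As a toy counterpart to the habituation behaviour described (not the paper's spiking-neuron model), looking time can be sketched as a quantity that decays with repeated exposure and recovers when a novel stimulus resets the exposure count. The exponential form and constants here are assumptions.

```python
# Toy habituation model: looking time decays with repeated exposure.
# The exponential form and constants are illustrative assumptions.
import math

def looking_time(exposures: int, base: float = 10.0, rate: float = 0.5) -> float:
    """Looking time (seconds) decays exponentially with exposures."""
    return base * math.exp(-rate * exposures)

for trial in range(5):
    print(f"familiar stimulus, trial {trial}: {looking_time(trial):.1f} s")

# A novel stimulus resets the exposure count, so looking time recovers
# (dishabituation) -- the behavioural signature used to detect habituation.
print(f"novel stimulus: {looking_time(0):.1f} s")
```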

Socially perceptive robots: Challenges and concerns

Social robots are those endowed with communication channels and abilities that take inspiration from human beings. The scope of such abilities should include those allowing a robot to understand people’s affective states and expressions, intentions, and actions, and to interpret them based on contextual information. Childcare robots are an example of robots that could take advantage of the integration of these capabilities. This commentary conducts a technical appraisal of the notion of autonomous childcare robots, focusing on these socially perceptive capabilities and reviewing some of the key challenges that remain to be investigated by the research community in this respect.

Guest Editorial A Sense of Interaction in Humans and Robots: From Visual Perception to Social Cognition

IEEE Transactions on Cognitive and Developmental Systems

Human ability to interact with one another is substantially strengthened by vision, with several visual processes tuned to support prosocial behaviors since early infancy. A key challenge of robotics research is to provide artificial agents with similar advanced visual perception skills, with the ultimate goal of designing machines able to recognize and interpret both explicit and implicit communication cues embedded in human behaviors. This special issue addresses this challenge, with a focus both on understanding the human perception that supports interaction abilities and on the implementation perspective, considering new algorithms and modeling efforts brought forward to improve current robotics. This multidisciplinary effort aims to bring innovations not only in human-machine interaction but also in domains such as developmental psychology and cognitive rehabilitation.

I. SCOPE OF THIS SPECIAL ISSUE

Since early infancy, the ability of humans to interact with one another has been substantially strengthened by vision, with several visual processes tuned to support prosocial behaviors. For instance, a natural predisposition to look at human faces or to detect biological motion is present at birth [items 1) and 2) in the Appendix]. More refined abilities, such as the understanding and anticipation of others' actions and intentions [items 3) and 4) in the Appendix], progressively develop with age, leading, in a few years, to a full capability of interaction based on mutual understanding, joint coordination, and collaboration. Today, a key challenge of robotics research is to provide artificial agents with similar advanced visual perception skills, with the ultimate goal of designing machines able to recognize and interpret both explicit and implicit communication cues embedded in human behaviors [items 5)-9) in the Appendix]. These achievements pave the way for the large-scale use of human-robot interaction applications in a variety of contexts, ranging from the design of personal robots to physical, social, and cognitive rehabilitation. Understanding how efficient and seamless collaborations can be achieved among human partners and which of them are the explicit and implicit cues intuitively …

Social robots in research on social and cognitive development in infants and toddlers: A Scoping review

2024

There is currently no systematic review of the growing body of literature on using social robots in early developmental research. Designing appropriate methods for early childhood research is crucial for broadening our understanding of young children's social and cognitive development. This scoping review systematically examines the existing literature on using social robots to study social and cognitive development in infants and toddlers aged between 2 and 35 months. Moreover, it aims to identify the research focus, findings, and reported gaps and challenges when using robots in research. We included empirical studies published between 1990 and May 29, 2023. We searched for literature in PsychINFO, ERIC, Web of Science, and PsyArXiv. Twenty-nine studies met the inclusion criteria and were mapped using the scoping review method. Our findings reveal that most studies were quantitative, with experimental designs conducted in a laboratory setting where children were exposed to physically present or virtual robots in a one-to-one situation. We found that robots were used to investigate four main concepts: animacy, action understanding, imitation, and early conversational skills. Many studies focused on whether young children regard robots as agents or social partners. The studies demonstrated that young children could learn from and understand social robots in some situations but not always. For instance, children's understanding of social robots was often facilitated by robots that behaved interactively and contingently. This scoping review highlights the need to design social robots that can engage in interactive and contingent social behaviors for early developmental research.

Toward social cognition in robotics: extracting and internalizing meaning from perception

One of the long-term objectives of artificial cognition is that robots will increasingly be capable of interacting with their human counterparts in open-ended tasks that can change over time. To achieve this end, the robot should be able to acquire and internalize new knowledge from human-robot interaction, on-line. This implies that the robot should attend to and perceive the available cues, both verbal and nonverbal, that contain information about the inner qualities of its human counterparts. Social cognition focuses on the perceiver's ability to build cognitive representations of actors (emotions, intentions, etc.) and their contexts. These representations should give meaning to the sensed inputs and mediate the behavioural responses of the robot within this social scenario. This paper describes how the abilities for building such cognitive representations are currently being endowed in the cognitive software architecture RoboCog. It also presents a first set of complete experiments involving different user profiles. These experiments show the promising possibilities of the proposal and reveal the main future improvements to be addressed.
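A minimal sketch of the idea that an internal representation of the human partner, rather than raw sensory input, mediates the robot's response. The fields and selection rules below are illustrative assumptions and do not reflect RoboCog's actual interfaces.

```python
# Sketch: a cognitive representation of the partner mediates behaviour;
# fields and rules are illustrative assumptions, not RoboCog's API.
from dataclasses import dataclass

@dataclass
class PartnerModel:
    """Internalized representation built from verbal and nonverbal cues."""
    emotion: str        # e.g., inferred from facial expression or prosody
    intention: str      # e.g., inferred from gaze and gesture
    engagement: float   # 0.0 (disengaged) .. 1.0 (fully engaged)

def select_behaviour(p: PartnerModel) -> str:
    """Behavioural response mediated by the representation, not raw input."""
    if p.engagement < 0.3:
        return "re-engage: orient toward partner and greet"
    if p.emotion == "confused":
        return "offer help with the current task"
    return f"continue task, supporting intention: {p.intention}"

print(select_behaviour(PartnerModel("confused", "assemble part", 0.8)))
```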