Human–robot interaction
Study of interactions between humans and robots
Human–robot interaction (HRI) is the study of interactions between humans and robots. Human–robot interaction is a multidisciplinary field with contributions from human–computer interaction, artificial intelligence, robotics, natural language processing, design, psychology and philosophy. A subfield known as physical human–robot interaction (pHRI) has tended to focus on device design to enable people to safely interact with robotic systems.[1]
Human–robot interaction has been a topic of both science fiction and academic speculation even before any robots existed. Because much of active HRI development depends on natural language processing, many aspects of HRI are continuations of human communications, a field of research which is much older than robotics.
The origin of HRI as a discrete problem was stated by the 20th-century author Isaac Asimov in his short story "Runaround" (1942), later collected in I, Robot. Asimov coined the Three Laws of Robotics, namely:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.[2]
These three laws provide an overview of the goals engineers and researchers hold for safety in the HRI field, although the fields of robot ethics and machine ethics are more complex than these three principles. In general, however, human–robot interaction prioritizes the safety of humans who interact with potentially dangerous robotic equipment. Solutions to this problem range from the philosophical approach of treating robots as ethical agents (individuals with moral agency) to the practical approach of creating safety zones. These safety zones use technologies such as lidar to detect human presence, or physical barriers, to protect humans by preventing any contact between machine and operator.[3]
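As an illustration of the safety-zone approach, the sketch below checks a lidar scan for any return inside a protective radius and halts the robot while the zone is occupied. The radius value and the stop_motion/resume_motion calls are assumptions for illustration, not part of any particular safety standard or robot API.

```python
import math

SAFETY_RADIUS_M = 1.5  # assumed protective separation distance in metres


def zone_violated(lidar_ranges, radius=SAFETY_RADIUS_M):
    """Return True if any lidar return lies inside the safety zone."""
    return any(math.isfinite(r) and r < radius for r in lidar_ranges)


def safety_monitor(robot, lidar_ranges):
    """Halt the robot while anything (for example a person) is inside the zone."""
    if zone_violated(lidar_ranges):
        robot.stop_motion()    # hypothetical controller call
    else:
        robot.resume_motion()  # hypothetical controller call
```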
Although robots in the human–robot interaction field initially required some human intervention to function, research has expanded this to the extent that fully autonomous systems are now far more common than in the early 2000s.[4] Autonomous systems range from simultaneous localization and mapping systems, which provide intelligent robot movement, to natural-language processing and natural-language generation systems, which allow for natural, human-like interaction that meets well-defined psychological benchmarks.[5]
Anthropomorphic robots (machines which imitate human body structure) are better described by the biomimetics field, but overlap with HRI in many research applications. Examples of robots which demonstrate this trend include Willow Garage's PR2 robot, the NASA Robonaut, and Honda ASIMO. However, robots in the human–robot interaction field are not limited to human-like robots: Paro and Kismet are both robots designed to elicit emotional response from humans, and so fall into the category of human–robot interaction.[6]
Goals in HRI range from industrial manufacturing with cobots, to medical technology for rehabilitation, autism intervention, and elder care, to entertainment, human augmentation, and human convenience.[7] Future research therefore covers a wide range of fields, much of it focused on assistive robotics, robot-assisted search and rescue, and space exploration.[8]
The goal of friendly human–robot interactions
Kismet can produce a range of facial expressions.
Robots are artificial agents with capacities of perception and action in the physical world, which researchers often refer to as the workspace. Their use has become commonplace in factories, but they are increasingly found in the most technologically advanced societies in critical domains such as search and rescue, military operations, mine and bomb detection, scientific exploration, law enforcement, entertainment, and hospital care.
These new domains of application imply closer interaction with the user. The concept of closeness is to be taken in its full meaning: robots and humans share the workspace but also share goals in terms of task achievement. This close interaction requires new theoretical models, on the one hand for the robotics scientists who work to improve robots' utility and safety, and on the other hand to evaluate the risks and benefits of this new "friend" for modern society. The subfield of physical human–robot interaction (pHRI) has largely focused on device design to enable people to safely interact with robotic systems, but is increasingly developing algorithmic approaches to support fluent and expressive interactions between humans and robotic systems.[1]
With advances in AI, research is focusing not only on the safest possible physical interaction but also on socially appropriate interaction, which depends on cultural criteria. The goal is to build intuitive and easy communication with the robot through speech, gestures, and facial expressions.
Kerstin Dautenhahn refers to friendly human–robot interaction as "robotiquette", defining it as the "social rules for robot behaviour (a 'robotiquette') that is comfortable and acceptable to humans".[9] The robot has to adapt itself to our way of expressing desires and orders, not the contrary. But everyday environments such as homes have much more complex social rules than those implied by factories or even military environments. Thus, the robot needs perceiving and understanding capacities to build dynamic models of its surroundings. It needs to categorize objects, recognize and locate humans, and further recognize their emotions. The need for dynamic capacities pushes forward every subfield of robotics.
Furthermore, by understanding and perceiving social cues, robots can enable collaborative scenarios with humans. For example, with the rapid rise of personal fabrication machines such as desktop 3D printers and laser cutters entering our homes, scenarios may arise where robots and humans collaboratively share control, coordinate, and achieve tasks together. Industrial robots have already been integrated into industrial assembly lines and work collaboratively with humans. The social impact of such robots has been studied[10] and indicates that workers still treat robots as social entities and rely on social cues to understand and work with them.
At the other end of HRI research, cognitive modelling of the "relationship" between humans and robots benefits both psychologists and robotics researchers, and user studies are often of interest to both sides. This research endeavour forms part of the study of human society. For effective human–humanoid robot interaction,[11] numerous communication skills[12] and related features should be implemented in the design of such artificial agents/systems.
General HRI research
HRI research spans a wide range of fields, some general to the nature of HRI.
Methods for perceiving humans
Methods for perceiving humans in the environment are based on sensor information. Research on sensing components and software, led by Microsoft, provides useful results for extracting human kinematics (see Kinect). An example of an older technique is to use colour information, for example the fact that for light-skinned people the hands are lighter than the clothes worn. In any case, a human model specified a priori can then be fitted to the sensor data. The robot builds or has (depending on its level of autonomy) a 3D map of its surroundings to which the locations of humans are assigned.
Most methods intend to build a 3D model of the environment through vision. Proprioceptive sensors give the robot information about its own state, relative to a reference. Theories of proxemics may be used to perceive and plan around a person's personal space.
A speech recognition system is used to interpret human desires or commands. By combining the information inferred from proprioception, sensors, and speech, the robot can determine the human's position and state (standing, seated). In this context, natural-language processing is concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural-language data. For instance, neural-network architectures and learning algorithms can be applied to various natural-language processing tasks, including part-of-speech tagging, chunking, named-entity recognition, and semantic role labeling.[13]
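As a minimal illustration of the colour-based technique mentioned above, the following sketch segments candidate skin-coloured regions (such as hands) in a camera image. It assumes OpenCV and NumPy, and the HSV threshold values are illustrative assumptions that would need tuning for lighting conditions and skin tones.

```python
import cv2
import numpy as np


def detect_skin_regions(frame_bgr):
    """Return a binary mask of candidate skin-coloured pixels (e.g. hands, face).

    frame_bgr: camera image as an OpenCV BGR array.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Illustrative HSV bounds for light skin; real systems tune or learn these.
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Remove speckle noise before fitting an a-priori human model to the blobs.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return mask
```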
Methods for motion planning
Motion planning in dynamic environments is a challenge that can currently only be achieved for robots with 3 to 10 degrees of freedom. Humanoid robots, or even two-armed robots, which can have up to 40 degrees of freedom, are unsuited for dynamic environments with today's technology. However, lower-dimensional robots can use the potential field method to compute trajectories that avoid collisions with humans.
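The potential field method mentioned above can be illustrated with a minimal two-dimensional sketch: the goal exerts an attractive force, each detected human exerts a repulsive force within an influence radius, and the robot steps along the resulting force. The gains, radii, and step size below are arbitrary illustrative values, not parameters from any cited system.

```python
import numpy as np


def potential_field_step(robot_pos, goal_pos, human_positions,
                         k_att=1.0, k_rep=2.0, influence_radius=1.5, step=0.05):
    """Return the robot's next 2D position following the potential-field gradient."""
    robot = np.asarray(robot_pos, dtype=float)
    # Attractive force pulls the robot towards the goal.
    force = k_att * (np.asarray(goal_pos, dtype=float) - robot)
    # Repulsive forces push the robot away from nearby humans.
    for h in human_positions:
        diff = robot - np.asarray(h, dtype=float)
        dist = np.linalg.norm(diff)
        if 1e-6 < dist < influence_radius:
            force += k_rep * (1.0 / dist - 1.0 / influence_radius) / dist**2 * (diff / dist)
    # Take a small step along the resulting force direction.
    norm = np.linalg.norm(force)
    return robot if norm < 1e-9 else robot + step * force / norm


# Example: one planning step with a person standing between the robot and its goal.
next_pos = potential_field_step([0.0, 0.0], [4.0, 0.0], [[2.0, 0.3]])
```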
Cognitive models and theory of mind
Humans exhibit negative social and emotional responses, as well as decreased trust, toward some robots that closely but imperfectly resemble humans; this phenomenon has been termed the "uncanny valley".[14] However, recent research on telepresence robots has established that mimicking human body postures and expressive gestures makes robots likeable and engaging in a remote setting.[15] Further, the presence of a human operator was felt more strongly when tested with an android or humanoid telepresence robot than with normal video communication through a monitor.[16]
While there is a growing body of research on users' perceptions of and emotions towards robots, understanding is still far from complete, and only additional experiments will yield a more precise model.
Based on past research, there are some indications about current user sentiment and behavior around robots:[17][18]
- During initial interactions, people are more uncertain, anticipate less social presence, and have fewer positive feelings when thinking about interacting with robots, and prefer to communicate with a human. This finding has been called the human-to-human interaction script.
- It has been observed that when a robot behaves proactively and does not respect a "safety distance" (entering the user's personal space), the user sometimes expresses fear. This fear response is person-dependent.
- It has also been shown that when a robot has no particular use, negative feelings are often expressed. The robot is perceived as useless and its presence becomes annoying.
- People have also been shown to attribute personality characteristics to the robot that were not implemented in software.
- People similarly infer the mental states of both humans and robots, except when robots and humans use non-literal language (such as sarcasm or white lies).[19]
- In line with the contact hypothesis,[20] supervised exposure to a social robot can decrease uncertainty and increase willingness to interact with the robot, compared to pre-exposure attitudes toward robots as a class of agents.[21]
- Interacting with a robot by looking at or touching the robot can reduce negative feelings that some people have about robots before interacting with them. Even imagined interaction can reduce negative feelings. However, in some cases, interacting with a robot can increase negative feelings for people with strong pre-existing negative sentiments towards robots.[22]
Methods for human–robot coordination
A large body of work in the field of human–robot interaction has looked at how humans and robots may better collaborate. The primary social cue for humans while collaborating is the shared perception of an activity. To this end, researchers have investigated anticipatory robot control through various methods, including monitoring the behaviors of human partners using eye tracking, making inferences about human task intent, and proactive action on the part of the robot.[23] These studies revealed that anticipatory control helped users perform tasks faster than reactive control alone.
A common approach to programming social cues into robots is to first study human–human behaviors and then transfer the learning.[24] For example, coordination mechanisms in human–robot collaboration[25] are based on work in neuroscience[26] which examined how to enable joint action in human–human configurations by studying perception and action in a social context rather than in isolation. These studies have revealed that maintaining a shared representation of the task is crucial for accomplishing tasks in groups. For example, the authors examined the task of driving together by separating the responsibilities of acceleration and braking, i.e., one person is responsible for accelerating and the other for braking; the study revealed that pairs reached the same level of performance as individuals only when they received feedback about the timing of each other's actions. Similarly, researchers have studied human–human handovers in household scenarios, such as passing dining plates, in order to enable adaptive control of human–robot handovers.[27] Another study, in the domain of human factors and ergonomics, of human–human handovers in warehouses and supermarkets reveals that givers and receivers perceive handover tasks differently, which has significant implications for designing user-centric human–robot collaborative systems.[28] Most recently, researchers have studied a system that automatically distributes assembly tasks among co-located workers to improve coordination.[29]
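One simple way to realise such anticipatory control is to maintain a probabilistic belief over the human partner's intended target, update it from gaze fixations, and act proactively once one target becomes sufficiently likely. The sketch below is a generic Bayesian-update illustration under assumed observation likelihoods; it is not the algorithm of any study cited above.

```python
def update_intent_belief(belief, gazed_target, p_look_at_intended=0.7):
    """Bayesian update of P(intended target) after observing one gaze fixation.

    belief: dict mapping target name -> prior probability.
    gazed_target: the target the person is currently looking at.
    p_look_at_intended: assumed probability of fixating the intended target.
    """
    n = len(belief)
    posterior = {}
    for target, prior in belief.items():
        likelihood = p_look_at_intended if target == gazed_target else (1 - p_look_at_intended) / (n - 1)
        posterior[target] = prior * likelihood
    total = sum(posterior.values())
    return {t: p / total for t, p in posterior.items()}


# Example: the robot starts fetching a part once it is >80% sure of the target.
belief = {"bolt": 1 / 3, "bracket": 1 / 3, "panel": 1 / 3}
for fixation in ["bracket", "bracket", "panel", "bracket"]:
    belief = update_intent_belief(belief, fixation)
    likely_target, confidence = max(belief.items(), key=lambda kv: kv[1])
    if confidence > 0.8:
        print(f"Proactively fetching {likely_target}")  # anticipatory action
        break
```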
Robots used for research in HRI
Some research involves designing new robots, while other studies use commercially available robots. Commonly used robots include Nao, a humanoid programmable robot; Pepper, another social humanoid robot; and Misty, a programmable companion robot.
The Nao robot is often used for HRI research as well as other HRI applications.
The majority of robots are white in color, which studies have linked to a bias against robots of other colors.[30][31][32][33][34]
The application areas of human–robot interaction include robotic technologies that are used by humans for industry, medicine, and companionship, among other purposes.
An example of an industrial collaborative robot, Sawyer, working alongside humans on the factory floor.
Industrial robots have been implemented to collaborate with humans on industrial manufacturing tasks. Humans have the flexibility and intelligence to consider different approaches to a problem, choose the best option, and then command robots to perform the assigned tasks, while robots can be more precise and more consistent in performing repetitive and dangerous work.[35] Together, the collaboration of industrial robots and humans demonstrates that robots can help ensure efficient manufacturing and assembly.[35] However, there are persistent concerns about the safety of human–robot collaboration, since industrial robots can move heavy objects and operate dangerous and sharp tools quickly and with force, which presents a potential threat to people who work in the same workspace.[35] Therefore, the planning of safe and effective layouts for collaborative workplaces is one of the most challenging research topics.[36]
Researchers from the University of Texas demonstrated a rehabilitation robot that assists hand movements.
A rehabilitation robot is an example of a robot-aided system implemented in health care. This type of robot helps stroke survivors or individuals with neurological impairments recover hand and finger movements.[37][38] In the past few decades, how humans and robots interact with each other has been widely considered in the design of rehabilitation robots.[38] For instance, human–robot interaction plays an important role in designing exoskeleton rehabilitation robots, since the exoskeleton system makes direct contact with the human body.[37]
Elder care and companion robot
Paro, a therapeutic robot intended for use in hospitals and nursing homes
Nursing robots are intended to provide assistance to elderly people who may have experienced a decline in physical and cognitive function and, consequently, developed psychosocial issues.[39] By assisting with daily physical activities, such robots can allow the elderly to retain a sense of autonomy and to feel that they are still able to take care of themselves and stay in their own homes.[39]
Long-term research on human–robot interaction has shown that residents of care homes are willing to interact with humanoid robots and benefit from cognitive and physical activation led by the robot Pepper.[40] Another long-term study in a care home showed that people working in the care sector are willing to use robots in their daily work with residents.[41] It also revealed that even though the robots are ready to be used, they still need human assistants: they cannot replace the human workforce, but they can assist it and open up new possibilities.[41]
An exhibition at the Science Museum, London, demonstrating robot toys for children with autism, intended to help autistic children pick up social cues from facial expressions.[42]
Autism intervention
Over the past decade, human–robot interaction has shown promising outcomes in autism intervention.[43] Children with autism spectrum disorders (ASD) are more likely to connect with robots than with humans, and using social robots is considered a beneficial approach to helping children with ASD.[43]
However, social robots used in ASD intervention are often not viewed as a viable treatment by clinical communities, because studies of social robots in ASD intervention frequently do not follow standard research protocols.[43] In addition, the outcomes of the research have not demonstrated a consistent positive effect that could be considered evidence-based practice (EBP) under clinical systematic evaluation.[43] As a result, researchers have started to establish guidelines for how to conduct studies of robot-mediated intervention and hence produce reliable data that could be treated as EBP, which would allow clinicians to choose to use robots in ASD intervention.[43]
Education robots
Robots can act as tutors or peers in the classroom.[44] When acting as a tutor, the robot can provide instruction, information, and individual attention to students. When acting as a peer learner, the robot can enable "learning by teaching" for students.[45]
Robots can be configured as collaborative robots and used for the rehabilitation of users with motor impairments. Using interactive technologies such as automatic speech recognition and eye-gaze tracking, users with motor impairments can control robotic agents and use them for rehabilitation activities such as powered-wheelchair control and object manipulation.
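As a hedged illustration of speech-based wheelchair control, the sketch below maps a small vocabulary of recognised commands to linear and angular velocity set-points. The command words, speed values, and the send_velocity callback are assumptions for illustration, not a real wheelchair or speech-recognition API.

```python
# Hypothetical mapping from recognised voice commands to (linear m/s, angular rad/s).
COMMAND_VELOCITIES = {
    "forward": (0.4, 0.0),
    "back": (-0.2, 0.0),
    "left": (0.0, 0.5),
    "right": (0.0, -0.5),
    "stop": (0.0, 0.0),
}


def handle_speech_command(transcript, send_velocity):
    """Convert a speech-recognition transcript into a wheelchair velocity command.

    transcript: lower-cased text from an automatic speech recogniser.
    send_velocity: callback taking (linear, angular); assumed wheelchair interface.
    """
    for word, velocity in COMMAND_VELOCITIES.items():
        if word in transcript.split():
            send_velocity(*velocity)
            return word
    send_velocity(0.0, 0.0)  # unrecognised commands default to a safe stop
    return "stop"


# Example usage with a stand-in for the wheelchair interface.
handle_speech_command("please go forward", lambda lin, ang: print(lin, ang))
```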
A specific example of human–robot interaction is human–vehicle interaction in automated driving. The goal of human–vehicle cooperation is to ensure safety, security, and comfort in automated driving systems.[46] Continued improvement of these systems, and progress towards highly and fully automated vehicles, aim to make the driving experience safer and more efficient, so that humans do not need to intervene in the driving process under unexpected conditions, such as a pedestrian crossing the street where they are not supposed to.[46]
A drone is an example of a UAV that could be used, for example, to locate a missing person in the mountains.
Unmanned aerial vehicles (UAVs) and unmanned underwater vehicles (UUVs) have the potential to assist search-and-rescue work in wilderness areas, for example by remotely locating a missing person from the evidence they have left in the surrounding area.[47][48] Such systems integrate autonomy and information, such as coverage maps, GPS information, and search-video quality, to support humans performing search-and-rescue work efficiently in the limited time available.[47][48]
The project "Moonwalk" is aimed to simulate the manned mission to Mars and to test the robot-astronaut cooperation in an analogue environment.
Humans have been working towards the next breakthrough in space exploration, such as a crewed mission to Mars.[49] This challenge has identified the need to develop planetary rovers that can assist astronauts and support their operations during such missions.[49] Collaboration between rovers, UAVs, and humans leverages the capabilities of all sides and optimizes task performance.[49]
Agricultural robots
Agriculture has traditionally relied heavily on human labor, but agricultural robots such as milking robots have been adopted in large-scale farming. Hygiene is a major issue in the agri-food sector, and such technology has had a wide impact on agriculture. Robots can also be used for tasks that might be hazardous to human health, such as the application of chemicals to plants.[50]
Anthropomorphism and the uncanny valley
Bartneck and Okada[51] suggest that a robotic user interface can be described by the following four properties:
Tool – toy scale
- Is the system designed to solve a problem effectively or is it just for entertainment?
Remote control – autonomous scale
- Does the robot require remote control or is it capable of action without direct human influence?
Reactive – dialogue scale
- Does the robot rely on a fixed interaction pattern or is it able to have dialogue — exchange of information — with a human?
Anthropomorphism scale
- Does it have the shape or properties of a human?
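These four properties can be recorded as a simple descriptive profile of a robotic user interface. The sketch below encodes each scale as a value between 0 and 1; this numeric encoding, and the example ratings, are illustrative assumptions rather than part of Bartneck and Okada's framework.

```python
from dataclasses import dataclass


@dataclass
class RoboticUserInterfaceProfile:
    """Position of a robot on the four descriptive scales, each from 0.0 to 1.0."""
    tool_toy: float           # 0 = pure tool, 1 = pure toy
    autonomy: float           # 0 = fully remote controlled, 1 = fully autonomous
    reactive_dialogue: float  # 0 = fixed interaction pattern, 1 = open dialogue
    anthropomorphism: float   # 0 = machine-like, 1 = human-like


# Example: a hypothetical rating of a social robot such as Kismet.
kismet_profile = RoboticUserInterfaceProfile(
    tool_toy=0.8, autonomy=0.6, reactive_dialogue=0.7, anthropomorphism=0.5,
)
```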
ACE – International Conference on Future Applications of AI, Sensors, and Robotics in Society
The International Conference on Future Applications of AI, Sensors, and Robotics in Society explores state-of-the-art research, highlighting future challenges as well as the hidden potential behind the technologies. Accepted contributions to the conference are published annually in a special edition of the Journal of Future Robot Life.
International Conference on Social Robotics
The International Conference on Social Robotics is a conference for scientists, researchers, and practitioners to report and discuss the latest progress of their forefront research and findings in social robotics, as well as interactions with human beings and integration into our society.
- ICSR2009, Incheon, Korea in collaboration with the FIRA RoboWorld Congress
- ICSR2010, Singapore
- ICSR2011, Amsterdam, Netherlands
International Conference on Human–Robot Personal Relationships
- HRPR2008, Maastricht
- HRPR 2009, Tilburg. Keynote speaker was Hiroshi Ishiguro.
- HRPR2010, Leiden. Keynote speaker was Kerstin Dautenhahn.
International Congress on Love and Sex with Robots
The International Congress on Love and Sex with Robots is an annual congress that invites and encourages a broad range of topics, such as AI, philosophy, ethics, sociology, engineering, computer science, and bioethics.
The earliest academic papers on the subject were presented at the 2006 E.C. Euron Roboethics Atelier, organized by the School of Robotics in Genoa, followed a year later by the first book, Love and Sex with Robots, published by Harper Collins in New York. Since that initial flurry of academic activity, the subject has grown significantly in breadth and worldwide interest. Three conferences on Human–Robot Personal Relationships were held in the Netherlands during the period 2008–2010; in each case the proceedings were published by respected academic publishers, including Springer-Verlag. After a gap, the conferences were renamed the "International Congress on Love and Sex with Robots" in 2014, and have since taken place at the University of Madeira in 2014, in London in 2016 and 2017, and in Brussels in 2019. Additionally, the Springer-Verlag International Journal of Social Robotics had, by 2016, published articles mentioning the subject, and an open-access journal called Lovotics, devoted entirely to the subject, was launched in 2012. The past few years have also seen a strong upsurge of interest through increased coverage of the subject in the print media, TV documentaries, and feature films, as well as within the academic community.
The International Congress on Love and Sex with Robots provides an excellent opportunity for academics and industry professionals to present and discuss their innovative work and ideas in an academic symposium.
- 2020, Berlin, Germany
- 2019, Brussels, Belgium
- 2017, London, United Kingdom
- 2016, London, United Kingdom
- 2014, Madeira, Portugal
International Symposium on New Frontiers in Human–Robot Interaction
This symposium is organized in collaboration with the Annual Convention of the Society for the Study of Artificial Intelligence and Simulation of Behaviour.
- 2015, Canterbury, United Kingdom
- 2014, London, United Kingdom
- 2010, Leicester, United Kingdom
- 2009, Edinburgh, United Kingdom
IEEE International Symposium on Robot and Human Interactive Communication
The IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) was founded in 1992 by Profs. Toshio Fukuda, Hisato Kobayashi, Hiroshi Harashima and Fumio Hara. Early workshop participants were mostly Japanese, and the first seven workshops were held in Japan. Since 1999, workshops have been held in Europe and the United States as well as Japan, and participation has been of international scope.
ACM/IEEE International Conference on Human–Robot Interaction
This conference is amongst the best conferences in the field of HRI and has a very selective reviewing process. The average acceptance rate is 26% and the average attendance is 187. Around 65% of the contributions to the conference come from the US, and the high quality of the submissions is reflected in the average of 10 citations that HRI papers have attracted so far.[52]
- HRI 2006 in Salt Lake City, Utah, USA, Acceptance Rate: 0.29
- HRI 2007 in Washington, D.C., USA, Acceptance Rate: 0.23
- HRI 2008 in Amsterdam, Netherlands, Acceptance Rate: 0.36 (0.18 for oral presentations)
- HRI 2009 in San Diego, CA, USA, Acceptance Rate: 0.19
- HRI 2010 in Osaka, Japan, Acceptance Rate: 0.21
- HRI 2011 in Lausanne, Switzerland, Acceptance Rate: 0.22 for full papers
- HRI 2012 in Boston, Massachusetts, USA, Acceptance Rate: 0.25 for full papers
- HRI 2013 in Tokyo, Japan, Acceptance Rate: 0.24 for full papers
- HRI 2014 in Bielefeld, Germany, Acceptance Rate: 0.24 for full papers
- HRI 2015 in Portland, Oregon, USA, Acceptance Rate: 0.25 for full papers
- HRI 2016 in Christchurch, New Zealand, Acceptance Rate: 0.25 for full papers
- HRI 2017 in Vienna, Austria, Acceptance Rate: 0.24 for full papers
- HRI 2018 in Chicago, USA, Acceptance Rate: 0.24 for full papers
- HRI 2021 in Boulder, USA, Acceptance Rate: 0.23 for full papers
International Conference on Human–Agent Interaction
- HAI 2013 in Sapporo, Japan
- HAI 2014 in Tsukuba, Japan
- HAI 2015 in Daegu, Korea
- HAI 2016 in Singapore
- HAI 2017 in Bielefeld, Germany
There are many conferences that are not exclusively HRI, but deal with broad aspects of HRI, and often have HRI papers presented.
- IEEE-RAS/RSJ International Conference on Humanoid Robots (Humanoids)
- Ubiquitous Computing (UbiComp)
- IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
- Intelligent User Interfaces (IUI)
- Computer Human Interaction (CHI)
- American Association for Artificial Intelligence (AAAI)
- INTERACT
There are currently two dedicated HRI journals:
- ACM Transactions on Human–Robot Interaction (Originally Journal of Human–Robot Interaction)
- International Journal of Social Robotics
There are also several more general journals in which HRI articles can be found:
- International Journal of Humanoid Robotics
- ‘Entertainment robotics’ section of the Entertainment Computing Journal
- Interaction Studies Journal
- Artificial Intelligence
- Systems, Man and Cybernetics
There are several books available that specialise in human–robot interaction. While there are several edited books, only a few dedicated texts are available:
- Bartneck, C.; Belpaeme, T.; Eyssel, F.; Kanda, T.; Keijsers, M.; Šabanović, S. (2019). Human–Robot Interaction - an introduction. Cambridge U.P.[53] – free PDF available online[54]
- Kanda, T.; Ishiguro, H. (2012). Human–Robot Interaction in Social Robotics. CRC Press.[55]
- Breazeal, C.; Dautenhahn, K.; Kanda, T. (2016). "Social Robotics". Springer Handbook of Robotics. pp. 1935–1972. – chapter in an extensive handbook.[56]
Many universities offer courses in Human–Robot Interaction.
University Courses and Degrees
- Tufts University, Medford, MA, USA, MS and PhD programs in Human–Robot Interaction
- University of Waterloo, Canada, Kerstin Dautenhahn, Social Robotics – Foundations, Technology and Applications of Human-Centered Robotics
- National Taipei University in Taiwan, Taiwan, Hooman Samani, M5226 Advanced Robotics
- Ontario Tech University, Canada, Patrick C. K. Hung, BUSI4590U Topics in Technology Management & INFR 4599U Service Robots Innovation for Commerce
- The Colorado School of Mines, USA, Tom Williams, CSCI 436 / 536: Human–Robot Interaction
- Heriot-Watt University, UK, Lynne Baillie, F21HR Human Robot Interaction
- Uppsala University, Sweden, Filip Malmberg, UU-61611 Social Robotics and Human–Robot Interaction
- Skövde University, Sweden, MSc Human–Robot Interaction program
- Indiana University, Bloomington, USA, Selma Sabanovic, INFO-I 440 Human–Robot Interaction
- Ghent University, Belgium, Tony Belpaeme, E019370A Robotics module
- Bielefeld University, Germany, Frederike Eyssel, 270037 Sozialpsychologische Aspekte der Mensch-Maschine Interaktion
- Kyoto University, Japan, Takayuki Kanda, 3218000 Human–Robot Interaction (ヒューマンロボットインタラクション)
- KTH Royal Institute of Technology, Sweden, Iolanda Leite, DD2413 Social Robotics
- Chalmers University of Technology, Sweden, Mohammad Obaid, DAT545 Human-Robot Interaction Design
Online Courses and Degrees
There are also online courses available, such as MOOCs:
- University of Canterbury (UCx) – edX program
- ^ a b Kalinowska, Aleksandra; Pilarski, Patrick M.; Murphey, Todd D. (3 May 2023). "Embodied Communication: How Robots and People Communicate Through Physical Interaction". Annual Review of Control, Robotics, and Autonomous Systems. 6 (1): 205–232. doi:10.1146/annurev-control-070122-102501. ISSN 2573-5144. S2CID 255701603.
- ^ Asimov, Isaac (1950). "Runaround". I, Robot. The Isaac Asimov Collection. New York: Doubleday. p. 40. ISBN 978-0-385-42304-5. This is an exact transcription of the laws. They also appear in the front of the book, and in both places there is no "to" in the 2nd law.
- ^ Hornbeck, Dan (2008-08-21). "Safety in Automation". machinedesign.com. Retrieved 2020-06-12.
- ^ Scholtz, Jean. "Evaluation methods for human-system performance of intelligent systems". Proceedings of the 2002 Performance Metrics for Intelligent Systems (PerMIS) Workshop. doi:10.1007/s10514-006-9016-5. S2CID 31481065.
- ^ Kahn, Peter H.; Ishiguro, Hiroshi; Friedman, Batya; Kanda, Takayuki (2006-09-08). What is a human? - Toward psychological benchmarks in the field of human–robot interaction. ROMAN 2006 - the 15th IEEE International Symposium on Robot and Human Interactive Communication. pp. 364–371. doi:10.1109/ROMAN.2006.314461. ISBN 1-4244-0564-5. S2CID 10368589.
- ^ "Meet Paro, the therapeutic robot seal". CNN. November 20, 2003. Retrieved 2020-06-12.
- ^ "The future of human–robot interaction". as.cornell.edu. 27 September 2017. Retrieved 2020-06-12.
- ^ "3: Emergence of HRI as a Field". Human–Robot Interaction. Retrieved 2020-06-12.
- ^ Dautenhahn, Kerstin (29 April 2007). "Socially intelligent robots: Dimensions of human–robot interaction". Philosophical Transactions of the Royal Society B: Biological Sciences. 362 (1480): 679–704. doi:10.1098/rstb.2006.2004. PMC 2346526. PMID 17301026.
- ^ Sauppé, Allison; Mutlu, Bilge (2015). Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM Conference on Human Factors in Computing Systems - CHI '15. pp. 3613–3622. doi:10.1145/2702123.2702181. ISBN 978-1-4503-3145-6. S2CID 3136657.
- ^ Human–Robot Interaction – via www.interaction-design.org.
- ^ Bubaš, Goran; Lovrenčić, Alen (2002). "Implications of interpersonal communication competence research on the design of artificial behavioral systems that interact with humans". Proceedings of the 6th International Conference on Intelligent Engineering Systems. International Conference on Intelligent Engineering Systems - INES 2002.
- ^ Collobert, Ronan; Weston, Jason; Bottou, Léon; Karlen, Michael; Kavukcuoglu, Koray; Kuksa, Pavel (2011). Natural Language Processing (Almost) from Scratch. OCLC 963993063.
- ^ Mathur, Maya B.; Reichling, David B. (2016). "Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley". Cognition. 146: 22–32. doi:10.1016/j.cognition.2015.09.008. PMID 26402646.
- ^
- ^ Sakamoto, Daisuke; Kanda, Takayuki; Ono, Tetsuo; Ishiguro, Hiroshi; Hagita, Norihiro (2007). "Android as a telecommunication medium with a human-like presence". Proceeding of the ACM/IEEE International Conference on Human–Robot Interaction. ACM/IEEE International Conference on Human–Robot Interaction - HRI '07. p. 193. doi:10.1145/1228716.1228743. ISBN 978-1-59593-617-2. S2CID 1093338.
- ^ Spence, Patric R.; Westerman, David; Edwards, Chad; Edwards, Autumn (July 2014). "Welcoming Our Robot Overlords: Initial Expectations About Interaction With a Robot". Communication Research Reports. 31 (3): 272–280. doi:10.1080/08824096.2014.924337. S2CID 144545474.
- ^ Edwards, Chad; Edwards, Autumn; Spence, Patric R.; Westerman, David (21 December 2015). "Initial Interaction Expectations with Robots: Testing the Human-To-Human Interaction Script". Communication Studies. 67 (2): 227–238. doi:10.1080/10510974.2015.1121899. S2CID 146204935.
- ^ Banks, Jaime (2021-01-28). "Of like mind: The (mostly) similar mentalizing of robots and humans". Technology, Mind, and Behavior. 1 (2). doi:10.1037/tmb0000025.
- ^ Pettigrew, T.F.; Tropp, L.R. (2006). "A meta-analytic test of intergroup contact theory". Journal of Personality and Social Psychology. 90 (5): 751–783. doi:10.1037/0022-3514.90.5.751. PMID 16737372. S2CID 14149856.
- ^ Haggadone, Brad A.; Banks, Jaime; Koban, Kevin (2021-04-07). "Of robots and robotkind: Extending intergroup contact theory to social machines". Communication Research Reports. 38 (3): 161–171. doi:10.1080/08824096.2021.1909551. S2CID 233566369.
- ^ Wullenkord, Ricarda; Fraune, Marlena R.; Eyssel, Friederike; Šabanović, Selma (August 2016). Getting in touch: How imagined, actual, and physical contact affect evaluations of robots. 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2016). pp. 980–985. doi:10.1109/ROMAN.2016.7745228.
- ^ Anticipatory robot control for efficient human–robot collaboration (pdf). HRI '16: The Eleventh ACM/IEEE International Conference on Human–Robot Interaction. 2016. pp. 83–90. ISBN 9781467383707.
- ^ Roy, Someshwar; Edan, Yael (2018-03-27). "Investigating Joint-Action in Short-Cycle Repetitive Handover Tasks: The Role of Giver Versus Receiver and its Implications for Human–Robot Collaborative System Design". International Journal of Social Robotics. 12 (5): 973–988. doi:10.1007/s12369-017-0424-9. ISSN 1875-4805. S2CID 149855145.
- ^ "Coordination mechanisms in human–robot collaboration". Proceedings of the 2013 ACM/IEEE International Conference on Human–Robot Interaction. ACM/IEEE International Conference on Human–Robot Interaction - HRI 2013. 2013. CiteSeerX 10.1.1.478.3634.
- ^ Sebanz, Natalie; Bekkering, Harold; Knoblich, Günther (February 2006). "Joint action: bodies and minds moving together". Trends in Cognitive Sciences. 10 (2): 70–76. doi:10.1016/j.tics.2005.12.009. hdl:2066/55284. PMID 16406326. S2CID 1781023.
- ^ Huang, Chien-Ming; Cakmak, Maya; Mutlu, Bilge (2015). Adaptive coordination strategies for human–robot handovers (PDF). Robotics: Science and Systems.
- ^ Someshwar, Roy; Edan, Yael (2017-08-30). "Givers & receivers perceive handover tasks differently: Implications for human–robot collaborative system design". arXiv:1708.06207 [cs.HC].
- ^ "WeBuild: Automatically distributing assembly tasks among collocated workers to improve coordination". CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. 2017. doi:10.1145/3025453.3026036.
- ^ Bartneck, Christoph; Yogeeswaran, Kumar; Ser, Qi Min; Woodward, Graeme; Sparrow, Robert; Wang, Siheng; Eyssel, Friederike (26 February 2018). "Robots And Racism". Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction. pp. 196–204. doi:10.1145/3171221.3171260. ISBN 978-1-4503-4953-6.
- ^ Paterson, Mark (26 January 2024). "Why are so many robots white?". The Conversation. Retrieved 8 June 2024.
- ^ Klein, Caroline; Allan, David (1 August 2019). "Robot racism? Yes, says a study showing humans' biases extend to robots". CNN. Retrieved 8 June 2024.
- ^ Samuel, Sigal (2 August 2019). "Humans keep directing abuse — even racism — at robots". Vox. Retrieved 8 June 2024.
- ^ Ackerman, Evan (July 17, 2018). "Humans Show Racial Bias Towards Robots of Different Colors: Study - IEEE Spectrum". IEEE. Retrieved 8 June 2024.
- ^ a b c Hentout, Abdelfetah; Aouache, Mustapha; Maoudj, Abderraouf; Akli, Isma (2019-08-18). "Human–robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017". Advanced Robotics. 33 (15–16): 764–799. doi:10.1080/01691864.2019.1636714. ISSN 0169-1864. S2CID 198488518.
- ^ Rega, Andrea; Di Marino, Castrese; Pasquariello, Agnese; Vitolo, Ferdinando; Patalano, Stanislao; Zanella, Alessandro; Lanzotti, Antonio (20 December 2021). "Collaborative Workplace Design: A Knowledge-Based Approach to Promote Human–Robot Collaboration and Multi-Objective Layout Optimization". Applied Sciences. 11 (24): 12147. doi:10.3390/app112412147. hdl:10446/285754.
- ^ a b Aggogeri, Francesco; Mikolajczyk, Tadeusz; O'Kane, James (April 2019). "Robotics for rehabilitation of hand movement in stroke survivors". Advances in Mechanical Engineering. 11 (4): 168781401984192. doi:10.1177/1687814019841921. ISSN 1687-8140.
- ^ a b Oña, Edwin Daniel; Garcia-Haro, Juan Miguel; Jardón, Alberto; Balaguer, Carlos (2019-06-26). "Robotics in health care: Perspectives of robot-aided interventions in clinical practice for rehabilitation of upper limbs". Applied Sciences. 9 (13): 2586. doi:10.3390/app9132586. hdl:10016/34029. ISSN 2076-3417.
- ^ a b Robinson, Hayley; MacDonald, Bruce; Broadbent, Elizabeth (November 2014). "The role of healthcare robots for older people at Home: A review". International Journal of Social Robotics. 6 (4): 575–591. doi:10.1007/s12369-014-0242-2. ISSN 1875-4791. S2CID 25075532.
- ^ Carros, Felix; Meurer, Johanna; Löffler, Diana; Unbehaun, David; Matthies, Sarah; Koch, Inga; Wieching, Rainer; Randall, Dave; Hassenzahl, Marc; Wulf, Volker (21 April 2020). "Exploring Human-Robot Interaction with the Elderly: Results from a Ten-Week Case Study in a Care Home". Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. pp. 1–12. doi:10.1145/3313831.3376402. ISBN 9781450367080. S2CID 218483496.
- ^ a b Carros, Felix; Schwaninger, Isabel; Preussner, Adrian; Randall, Dave; Wieching, Rainer; Fitzpatrick, Geraldine; Wulf, Volker (May 2022). "Care Workers Making Use of Robots: Results of a Three-Month Study on Human-Robot Interaction within a Care Home". CHI Conference on Human Factors in Computing Systems. pp. 1–15. doi:10.1145/3491102.3517435. ISBN 9781450391573. S2CID 248419908.
- ^ Curtis, Sophie (2017-07-28). "This creepy-looking humanoid robot has a very important purpose". The Mirror. Retrieved 2019-10-28.
- ^ a b c d e Begum, Momotaz; Serna, Richard W.; Yanco, Holly A. (April 2016). "Are robots ready to deliver autism interventions? A comprehensive review". International Journal of Social Robotics. 8 (2): 157–181. doi:10.1007/s12369-016-0346-y. ISSN 1875-4791. S2CID 15396137.
- ^ Belpaeme, Tony; Kennedy, James; Ramachandran, Aditi; Scassellati, Brian; Tanaka, Fumihide (2018-08-22). "Social robots for education: A review". Science Robotics. 3 (21): eaat5954. doi:10.1126/scirobotics.aat5954. ISSN 2470-9476. PMID 33141719. S2CID 52033756.
- ^ Catlin, Dave (2014), Cao, Yiwei; Väljataga, Terje; Tang, Jeff K.T.; Leung, Howard (eds.), "Using Peer Assessment with Educational Robots", New Horizons in Web Based Learning, Lecture Notes in Computer Science, vol. 8699, Cham: Springer International Publishing, pp. 57–65, doi:10.1007/978-3-319-13296-9_6, ISBN 978-3-319-13295-2, retrieved 2023-03-01
- ^ a b Biondi, Francesco; Alvarez, Ignacio; Jeong, Kyeong-Ah (2019-07-03). "Human–Vehicle Cooperation in Automated Driving: A Multidisciplinary Review and Appraisal". International Journal of Human–Computer Interaction. 35 (11): 932–946. doi:10.1080/10447318.2018.1561792. ISSN 1044-7318. S2CID 86447168.
- ^ a b Goodrich, M. A.; Lin, L.; Morse, B. S. (May 2012). "Using camera-equipped mini-UAVS to support collaborative wilderness search and rescue teams". 2012 International Conference on Collaboration Technologies and Systems (CTS). p. 638. doi:10.1109/CTS.2012.6261008. ISBN 978-1-4673-1382-7. S2CID 13164847.
- ^ a b Morse, Bryan S.; Engh, Cameron H.; Goodrich, Michael A. (2010). "UAV video coverage quality maps and prioritized indexing for wilderness search and rescue". Proceeding of the 5th ACM/IEEE international conference on Human-robot interaction - HRI '10 (PDF). Osaka, Japan: ACM Press. pp. 227–234. doi:10.1145/1734454.1734548. ISBN 9781424448937. S2CID 11511362.
- ^ a b c Bernard, Tiziano; Martusevich, Kirill; Rolins, Armando A.; Spence, Isaac; Troshchenko, Alexander; Chintalapati, Sunil (2018-09-17). A novel Mars rover concept for astronaut operational support on surface EVA missions. 2018 AIAA SPACE and Astronautics Forum and Exposition. Orlando, FL: American Institute of Aeronautics and Astronautics. doi:10.2514/6.2018-5154. ISBN 9781624105753.
- ^ "Emerging Technology for Application in the Agri-Food Sector - SIPMM Publications". publication.sipmm.edu.sg. 2019-01-01. Retrieved 2022-11-15.
- ^ Bartneck, Christoph; Michio Okada (2001). "Robotic User Interfaces" (PDF). Proceedings of the Human and Computer Conference. pp. 130–140.
- ^ Bartneck, Christoph (February 2011). "The end of the beginning: a reflection on the first five years of the HRI conference". Scientometrics. 86 (2): 487–504. doi:10.1007/s11192-010-0281-x. PMC 3016230. PMID 21297856.
- ^ Bartneck, Christoph; Belpaeme, Tony; Eyssel, Friederike; Kanda, Takayuki; Keijsers, Merel; Šabanović, Selma (2019). Human–Robot Interaction - An Introduction. Cambridge, UK: Cambridge University Press. ISBN 9781108735407. Retrieved 27 January 2020.
- ^ "Human Robot Interaction – Share and enjoy!".
- ^ Kanda, Takayuki (2012). Human–Robot Interaction in Social Robotics. Boca Raton, FL: CRC Press. ISBN 9781466506978.
- ^ Breazeal, Cynthia; Dautenhahn, Kerstin; Takayuki, Kanda (2016). "Social Robotics". In Siciliano, Bruno; Khatib, Oussama (eds.). Springer Handbook of Robotics. Berlin: Springer. pp. 1935–1972. ISBN 9783319325507.
- ^ "Professional Certificate in Human–Robot Interaction". edX. Canterbury, UK: University of Canterbury (UCx). 2021-09-01. Retrieved 2021-09-01.
- ^ "Introduction to Human–Robot Interaction". edX. Canterbury, UK: University of Canterbury (UCx). 2021-09-01. Retrieved 2021-09-01.
- ^ "Methods and Application in Human–Robot Interaction". edX. Canterbury, UK: University of Canterbury (UCx). 2021-09-01. Retrieved 2021-09-01.
- "Human interaction with the robot J2B2". hakenberg.de. Algorithms, graphics, and video material.
- Hottelet, Ulrich (June 2009). "Albert is not happy – How robots learn to live with people". African Times. Archived from the original on 2012-01-12.