Network Engineering Using Autonomous Agents Increases Cooperation in Human Groups - PubMed
Hirokazu Shirado et al. iScience. 2020.
Abstract
Cooperation in human groups is challenging: although various mechanisms can sustain it, it usually decays over time. Here, we perform theoretically informed experiments involving networks of humans (1,024 subjects in 64 networks) playing a public-goods game, to which we sometimes added autonomous agents (bots) programmed to use only local knowledge. We show that cooperation can not only be stabilized but even promoted when the bots intervene in the partner selections made by the humans, re-shaping social connections locally within a larger group. Cooperation rates increased from 60.4% at baseline to 79.4% at the end of the experiment. This network-intervention strategy outperformed other strategies, such as adding bots playing tit-for-tat. We also confirm that even a single bot can foster cooperation in human groups by using a mixed strategy designed to support the development of cooperative clusters. Simple artificial intelligence can thus increase the cooperation of human groups.
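The setup described in the abstract can be sketched in code. This is a hypothetical illustration, not the paper's protocol: the actual payoff parameters and bot algorithm are specified in the paper's Transparent Methods. The sketch shows a neighborhood-based public-goods game and a local rewiring rule in which a bot cuts a tie to a defector and forms a tie to a cooperator, using only locally observable information.

```python
def play_round(graph, strategy, contribution=1.0, multiplier=1.6):
    """One round: each node hosts a pot; cooperators pay into every pot
    they belong to, and each pot is multiplied and shared equally.
    (Parameter values here are illustrative, not the paper's.)"""
    payoff = {n: 0.0 for n in graph}
    for host, neighbors in graph.items():
        group = [host] + sorted(neighbors)
        pot = sum(contribution for m in group if strategy[m])
        share = pot * multiplier / len(group)
        for m in group:
            payoff[m] += share - (contribution if strategy[m] else 0.0)
    return payoff

def bot_rewire(graph, strategy, bot):
    """Hypothetical local network-engineering rule: drop one defecting
    neighbor, then attach to one cooperator elsewhere, using only the
    bot's neighborhood and the players' last observed actions."""
    defectors = sorted(m for m in graph[bot] if not strategy[m])
    if defectors:
        d = defectors[0]
        graph[bot].discard(d)
        graph[d].discard(bot)
    candidates = sorted(m for m in graph
                        if strategy[m] and m != bot and m not in graph[bot])
    if candidates:
        c = candidates[0]
        graph[bot].add(c)
        graph[c].add(bot)

# A 4-node example: node 0 is the bot, node 2 defects.
graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
strategy = {0: True, 1: True, 2: False, 3: True}
bot_rewire(graph, strategy, bot=0)
payoff = play_round(graph, strategy)
```

After the call, the bot has cut its tie to the defector (node 2) and formed a new tie to a cooperator (node 3), which is the kind of local partner-selection intervention the abstract describes; the defector still earns the most in a single round, which is why such interventions matter over repeated play.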
Keywords: Behavioral Neuroscience; Cognitive Neuroscience; Collaborative Computing; Human-Computer Interaction.
Copyright © 2020 The Authors. Published by Elsevier Inc. All rights reserved.
Conflict of interest statement
Declaration of Interests The authors declare no competing interests.
Figures
Graphical abstract
Figure 1
The Fraction of Cooperative Human Players per Round Light gray lines show results for each session; black lines show the average across all experimental sessions for each treatment (N = 8 sessions per treatment). Initial rates of average cooperation varied by chance across treatments (dashed lines). See the results of two other control conditions ("always cooperate" and "tit-for-tat") in Figure S1. Across all 48 groups, the average initial rate of cooperation was 68.2% ± 12.8%.
Figure 2
Average Change in Rates of Cooperation by Round Estimates are based on a GLMM: a logistic regression model of individual cooperation choice with random effects for sessions and individuals (see Transparent Methods). Error bars are 95% confidence intervals (CIs).
Figure 3
Cooperation Pattern with Neighborhood Change (A) Cell color shows cooperation probabilities estimated by GLMM shown in Table S2. (B) The impact of possible changes in a subject's surroundings is shown schematically. From a particular point in the parameter space (as shown in A), a person could move sideways or along the diagonal in the probabilistic space of human cooperation. Detaching or attaching to a defector changes the number of neighbors but does not change the number of cooperative neighbors; this direction obliquely crosses the contour of the cooperation probability distribution. On the other hand, detaching or attaching to a cooperator changes both the number of neighbors and the number of cooperative neighbors; such a change runs almost parallel to the contour of the cooperation probability distribution.
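The contour geometry described in the Figure 3 caption can be made concrete with a toy logistic model. This is a sketch with hypothetical coefficients (the paper's actual GLMM estimates are in its Table S2 and are not reproduced here): when the coefficient on total neighbor count roughly cancels the coefficient on cooperative-neighbor count, cutting or adding a tie to a cooperator moves along a probability contour, while cutting or adding a tie to a defector crosses contours.

```python
import math

# Hypothetical fixed effects for a GLMM-style logistic model; B_COOP and
# B_DEGREE are chosen to cancel exactly, which makes the probability
# contours run along the diagonal (Δneighbors = Δcooperators).
B0, B_COOP, B_DEGREE = 0.5, 0.8, -0.8

def p_cooperate(n_neighbors, n_coop_neighbors):
    """Cooperation probability given neighborhood composition."""
    z = B0 + B_COOP * n_coop_neighbors + B_DEGREE * n_neighbors
    return 1.0 / (1.0 + math.exp(-z))

base            = p_cooperate(4, 2)  # 4 neighbors, 2 of them cooperators
drop_defector   = p_cooperate(3, 2)  # cut a defector: crosses contours
drop_cooperator = p_cooperate(3, 1)  # cut a cooperator: stays on a contour
```

Here `drop_defector > base`, while `drop_cooperator` equals `base` exactly because the two coefficients cancel; in a fitted model the cancellation would only be approximate, which is why the caption says such a change runs "almost parallel" to the contours.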
Figure 4
Cooperation and Network Dynamics with a Single Bot Deploying a Mixed Strategy (A) Control diagram for how a single bot intervenes in a network of human subjects. (B) Experimental results for the average cooperation fraction with 95% CIs (N = 8 sessions per treatment). The orange line shows sessions with the single network-engineering bot; the dark gray line shows sessions without bots, identical to Figure 1A. (C) Experimental results for the average rate at which the bot's intervention strategy was actually applied to human players, by round. (D) Network snapshots of an example session with a single bot and a session without bots.
Similar articles
- Enhancing social cohesion with cooperative bots in societies of greedy, mobile individuals.
  Shi L, He Z, Shen C, Tanimoto J. PNAS Nexus. 2024 Jun 5;3(6):pgae223. doi: 10.1093/pnasnexus/pgae223. PMID: 38881842. Free PMC article.
- Stingy bots can improve human welfare in experimental sharing networks.
  Shirado H, Hou YT, Jung MF. Sci Rep. 2023 Oct 20;13(1):17957. doi: 10.1038/s41598-023-44883-0. PMID: 37864003. Free PMC article.
- Locally noisy autonomous agents improve global human coordination in network experiments.
  Shirado H, Christakis NA. Nature. 2017 May 17;545(7654):370-374. doi: 10.1038/nature22332. PMID: 28516927. Free PMC article.
- Oxytocin and vasopressin modulation of prisoner's dilemma strategies.
  Neto ML, Antunes M, Lopes M, Ferreira D, Rilling J, Prata D. J Psychopharmacol. 2020 Aug;34(8):891-900. doi: 10.1177/0269881120913145. PMID: 32207359. Free PMC article. Clinical Trial.
- Cooperative responses in rats playing a 2 × 2 game: Effects of opponent strategy, payoff, and oxytocin.
  Donovan A, Ryan E, Wood RI. Psychoneuroendocrinology. 2020 Nov;121:104803. doi: 10.1016/j.psyneuen.2020.104803. PMID: 32755813. Free PMC article.
Cited by
- Simple autonomous agents can enhance creative semantic discovery by human groups.
  Ueshima A, Jones MI, Christakis NA. Nat Commun. 2024 Jun 18;15(1):5212. doi: 10.1038/s41467-024-49528-y. PMID: 38890368. Free PMC article.
- Small bots, big impact: solving the conundrum of cooperation in optional Prisoner's Dilemma game through simple strategies.
  Sharma G, Guo H, Shen C, Tanimoto J. J R Soc Interface. 2023 Jul;20(204):20230301. doi: 10.1098/rsif.2023.0301. PMID: 37464799. Free PMC article.
- A new sociology of humans and machines.
  Tsvetkova M, Yasseri T, Pescetelli N, Werner T. Nat Hum Behav. 2024 Oct;8(10):1864-1876. doi: 10.1038/s41562-024-02001-8. PMID: 39438685. Review.
- Facilitating cooperation in human-agent hybrid populations through autonomous agents.
  Guo H, Shen C, Hu S, Xing J, Tao P, Shi Y, Wang Z. iScience. 2023 Oct 12;26(11):108179. doi: 10.1016/j.isci.2023.108179. PMID: 37920671. Free PMC article.
- Scaffolding cooperation in human groups with deep reinforcement learning.
  McKee KR, Tacchetti A, Bakker MA, Balaguer J, Campbell-Gillingham L, Everett R, Botvinick M. Nat Hum Behav. 2023 Oct;7(10):1787-1796. doi: 10.1038/s41562-023-01686-7. PMID: 37679439. Free PMC article.