An Agent-Based Model To Understand Tradeoffs In Online Community Design
Related papers
In this article, we advocate a new approach to theory development by translating and synthesizing insights from multiple social science theories in an agent-based model to understand challenges in building online communities. To demonstrate the utility of this approach, we use it to examine the effects of three types of discussion moderation in conversation-based communities: no moderation, in which all members are exposed to all messages; community-level moderation, in which off-topic messages are deleted for everyone in the group; and personalized moderation, in which people see different messages based on their interests. Our results suggest that personalized moderation outperforms the others in increasing members’ contribution and commitment, especially in topically broad communities and those with high message volume. In comparison, community-level moderation increases commitment but not contribution. Our results also reveal a critical trade-off between informational and relational benefits. This research demonstrates the value of agent-based modeling in synthesizing more narrowly focused theories to describe and prescribe behaviors in a complex system, to generate novel theoretical insights that were out of scope for the component theories, and to use these insights to inform the design of online communities.
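To make the kind of simulation described above concrete, the following is a minimal, hypothetical sketch in Python of an agent-based comparison of the three moderation conditions; the agent attributes, update rules, parameter values, and the single designated community topic are illustrative assumptions, not the model used in the paper.

```python
import random

# Toy agent-based model (illustrative only, not the paper's implementation).
# Each agent has one topical interest; the community has a designated topic.
# Agents post with probability tied to their commitment; commitment drifts up
# with on-interest messages they see and down with off-interest "noise".

N_AGENTS, N_TOPICS, N_ROUNDS = 100, 5, 200
COMMUNITY_TOPIC = 0  # the topic that community-level moderation keeps

def simulate(moderation, seed=0):
    random.seed(seed)
    agents = [{"interest": random.randrange(N_TOPICS), "commitment": 0.5}
              for _ in range(N_AGENTS)]
    contributions = 0
    for _ in range(N_ROUNDS):
        # Each member may post one message on a random topic this round.
        messages = [{"topic": random.randrange(N_TOPICS)}
                    for a in agents if random.random() < 0.1 * a["commitment"]]
        contributions += len(messages)
        for a in agents:
            if moderation == "community":
                # Off-topic messages are deleted for everyone.
                visible = [m for m in messages if m["topic"] == COMMUNITY_TOPIC]
            elif moderation == "personalized":
                # Each member sees only messages matching their own interest.
                visible = [m for m in messages if m["topic"] == a["interest"]]
            else:  # no moderation
                visible = messages
            relevant = sum(m["topic"] == a["interest"] for m in visible)
            noise = len(visible) - relevant
            a["commitment"] = min(1.0, max(0.0, a["commitment"]
                                           + 0.01 * relevant - 0.005 * noise))
    mean_commitment = sum(a["commitment"] for a in agents) / N_AGENTS
    return contributions, round(mean_commitment, 3)

for mode in ("none", "community", "personalized"):
    print(mode, simulate(mode))
```

Running the three conditions side by side illustrates the kind of trade-off the abstract describes: in this toy setup, filtering reduces the noise each member sees, which raises commitment, but it also shrinks each member's exposure to off-interest discussion.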
Agent-Based Modelling for Online Community Designers
2017
Online community designers have to make difficult design decisions with unclear outcomes. Using an agent-based model by Ren and Kraut, member behavior under different design choices can be predicted. This paper adds parameters such as the newcomer arrival rate to the model in order to understand how to increase the activity level of members and to make the underlying networks visible. Using the new interface, the model shows that small communities tend to have a larger percentage of contributors.
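As a hypothetical sketch of the kind of extension mentioned above, a newcomer arrival rate could be added as a hook to the toy simulation shown earlier; the parameter name, value, and arrival mechanics below are assumptions for illustration only.

```python
import random

N_TOPICS = 5          # as in the earlier sketch
NEWCOMER_RATE = 0.05  # hypothetical: expected newcomer arrivals per existing member per round

def add_newcomers(agents):
    # Draw a Bernoulli arrival per existing member, then append newcomers
    # with low initial commitment and a random topical interest.
    n_new = sum(random.random() < NEWCOMER_RATE for _ in agents)
    for _ in range(n_new):
        agents.append({"interest": random.randrange(N_TOPICS), "commitment": 0.2})
```

Called once per simulated round, a hook like this lets the arrival rate be varied to see how community size and the share of contributors respond.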
Social Control in Online Communities: A Framework for Community Management
Online communities of consumption (OCCs) represent highly diverse groups of consumers whose interests are not always aligned. Social control in OCCs aims to effectively manage problems arising from this heterogeneity. Extant literature on social control in OCCs is fragmented: some studies focus on the principles of social control, while others focus on its implementation. Moreover, the domain is undertheorized. This article integrates the disparate literature on social control in OCCs, providing a first unified conceptualization of the topic. The authors conceptualize social control as a system, or configuration, of moderation practices. Moderation practices are executed during interactions operating under different governance structures (market, hierarchy, and clan) and serving different purposes (interaction initiation, maintenance, and termination). From this conceptualization, important areas of future research emerge and research questions are developed. The framework also serves as a community management tool for OCC managers, enabling the diagnosis of social control problems and the elaboration of strategies and tactics to address them.
Research issues in the design of online communities: report on the CHI 99 workshop
SIGCHI Bulletin, Volume 31, Number 4, October 1999
…various social networks, building trust and disseminating one's reputation. The second thread of the discussion had to do with the strong tension between the individual and the organization that becomes particularly acute in online communities. Who is allowed to 'speak'? Who owns the information produced? Who sets the rules about what may and may not be discussed?
Analyzing the Effectiveness of an Extensible Virtual Moderator
Proceedings of the ACM on Human-Computer Interaction, 2022
The problems associated with open-ended group discussion are well documented in sociology research. We seek to alleviate these issues using technology that autonomously serves as a discussion moderator. Building on top of an extensible framework called Diplomat, we develop a conversational agent, ArbiterBot, to promote efficiency, fairness, and professionalism in otherwise unstructured discussions. To evaluate the effectiveness of this agent, we recruited university students to participate in a study involving a series of prompted discussions over the Slack messenger app. The results of this study suggest that the conversational agent is effective at balancing contributions across participants, encouraging a timely consensus, and promoting higher coverage of topics. We believe that the results motivate further investigation into how conversational agents can be used to improve group discussion and cooperation.
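As an illustration of one way a moderator bot might balance contributions, here is a small, hypothetical heuristic in Python; it is not ArbiterBot's actual logic, and the function and field names are assumptions.

```python
from collections import Counter

def next_to_prompt(message_log, participants):
    # Count messages per author and prompt the participant who has spoken least,
    # a simple heuristic for balancing contributions across a discussion.
    counts = Counter(msg["author"] for msg in message_log)
    return min(participants, key=lambda p: counts.get(p, 0))

log = [{"author": "alice"}, {"author": "alice"}, {"author": "bob"}]
print(next_to_prompt(log, ["alice", "bob", "carol"]))  # carol has not spoken yet
```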