Characterization and prediction of Wikipedia edit wars

Edit wars in Wikipedia

2011

Abstract We present a new, efficient method for automatically detecting severe conflicts, 'edit wars', in Wikipedia and evaluate this method on six different language Wikipedias. We discuss how the number of edits and reverts deviate in such pages from those following the general workflow, and argue that earlier work has significantly over-estimated the contentiousness of the Wikipedia editing process.

Dynamics of Edit War Sequences in Wikipedia

2020

In any collaborative system, cooperation and conflicts exist together. While in some cases these conflicts improve the output, they also lead to increased overhead. This requires examining the dynamics of these conflicts with the help of underlying data. In Wikipedia articles, the conflicts are captured by edit wars which may be examined through the revision history of these articles. In this work, we perform a systematic analysis of the conflicts present in 1,208 controversial articles of Wikipedia captured in the form of edit war sequences. We examine various key characteristics of these sequences and further use them to estimate the outcome of the edit wars. The study indicates the possibility of devising automated coordination mechanisms for handling conflicts in collaborative spaces.

USING N-GRAMS TO IDENTIFY EDIT WARS ON WIKIPEDIA

2019 IEEE Fifth International Conference on Multimedia Big Data (BigMM), Singapore, 2019

This paper presents a method for identifying Wikipedia edit wars using N-gram analysis. The analysis is conducted on a corpus of past versions of Wikipedia pages concerning historical figures who are glorified and idolised by the Hindu Right. The analysis shows that Wikipedia's open structure and Article Policies enable a conversation between academic and popular histories, a feat which has been difficult in India in the past.
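The paper's exact pipeline is not reproduced in the abstract, but the core of an N-gram comparison between successive revisions can be sketched as follows. This is a minimal illustration, not the authors' implementation: word-level n-grams are extracted from two revisions and compared by Jaccard similarity, so a war in which two factions keep swapping the same rival wordings shows up as pairs of revisions with high similarity to earlier versions but low similarity to their immediate predecessor.

```python
from collections import Counter

def ngrams(text, n=3):
    """Word-level n-grams of a revision's text, as a multiset."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))

def jaccard(a, b):
    """Jaccard similarity of two n-gram multisets (0 = disjoint, 1 = identical)."""
    inter = sum((a & b).values())
    union = sum((a | b).values())
    return inter / union if union else 1.0

# Two rival wordings that an edit war might oscillate between (invented example)
rev_a = "he was a great ruler of the kingdom"
rev_b = "he was a controversial ruler of the kingdom"
print(jaccard(ngrams(rev_a), ngrams(rev_b)))
```

Tracking this similarity across a page's full revision history would surface the oscillating pattern characteristic of a war, as opposed to the steadily diverging similarity of normal incremental editing.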

The dynamic nature of conflict in Wikipedia

EPL (Europhysics Letters), 2014

The voluntary process of Wikipedia edition provides an environment where the outcome is clearly a collective product of interactions involving a large number of people. We propose a simple agent-based model, developed from real data, to reproduce the collaborative process of Wikipedia edition. With a small number of simple ingredients, our model mimics several interesting features of real human behaviour, namely in the context of edit wars. We show that the level of conflict is determined by a tolerance parameter, which measures the editors' capability to accept different opinions and to change their own opinion. We propose to measure conflict with a parameter based on mutual reverts, which increases only in contentious situations. Using this parameter, we find a distribution for the inter-peace periods that is heavy-tailed. The effects of wiki-robots in the conflict levels and in the edition patterns are also studied. Our findings are compared with previous parameters used to measure conflicts in edit wars.
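The mutual-revert idea above can be made concrete with a simplified sketch. The paper's precise weighting is not given in the abstract; this version (an assumption, not the authors' formula) scores each pair of editors who have each reverted the other at least once by the smaller of their two edit counts, so that wars between experienced editors contribute more than drive-by vandalism, which typically produces only one-sided reverts.

```python
from collections import defaultdict

def mutual_revert_conflict(reverts, edit_counts):
    """Simplified mutual-revert conflict score.

    reverts:     iterable of (reverter, reverted_editor) pairs
    edit_counts: dict mapping editor -> total number of edits
    Each mutually reverting pair contributes min of the two edit counts.
    """
    reverted = defaultdict(set)               # reverter -> editors they reverted
    for reverter, target in reverts:
        reverted[reverter].add(target)
    score, seen = 0, set()
    for a, targets in reverted.items():
        for b in targets:
            pair = frozenset((a, b))
            if a == b or pair in seen:
                continue
            if a in reverted.get(b, set()):   # mutual: b also reverted a
                score += min(edit_counts.get(a, 0), edit_counts.get(b, 0))
                seen.add(pair)
    return score

reverts = [("alice", "bob"), ("bob", "alice"), ("carol", "bob")]
counts = {"alice": 40, "bob": 10, "carol": 5}
print(mutual_revert_conflict(reverts, counts))  # alice<->bob is mutual: min(40, 10)
```

Note how carol's one-sided revert of bob adds nothing: only in contentious, reciprocal situations does the score grow, which is exactly the property the abstract claims for its conflict parameter.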

Talk Before You Type: Coordination in Wikipedia

2007 40th Annual Hawaii International Conference on System Sciences (HICSS'07), 2007

Wikipedia, the online encyclopedia, has attracted attention both because of its popularity and its unconventional policy of letting anyone on the internet edit its articles. This paper describes the results of an empirical analysis of Wikipedia and discusses ways in which the Wikipedia community has evolved as it has grown. We contrast our findings with an earlier study [11] and present three main results. First, the community maintains a strong resilience to malicious editing, despite tremendous growth and high traffic. Second, the fastest growing areas of Wikipedia are devoted to coordination and organization. Finally, we focus on a particular set of pages used to coordinate work, the "Talk" pages. By manually coding the content of a subset of these pages, we find that these pages serve many purposes, notably supporting strategic planning of edits and enforcement of standard guidelines and conventions. Our results suggest that despite the potential for anarchy, the Wikipedia community places a strong emphasis on group coordination, policy, and process.

iChase: Supporting exploration and awareness of editing activities on wikipedia

2010

Abstract To increase its credibility and preserve the trust of its readers, Wikipedia needs to ensure a good quality of its articles. To that end, it is critical for Wikipedia administrators to be aware of contributors' editing activity to monitor vandalism, encourage reliable contributors to work on specific articles, or find mentors for new contributors. In this paper, we present iChase, a novel interactive visualization tool to provide administrators with better awareness of editing activities on Wikipedia.

A Characterization of Wikipedia Content Based on Motifs in the Edit Graph

2011

Abstract Wikipedia works because of the "many eyes" idea. Good Wikipedia pages are authoritative sources because a number of knowledgeable contributors have collaborated to produce an authoritative article on a topic. In this paper we explore the hypothesis that the extent to which the "many eyes" idea is true for a specific article can be assessed by looking at the edit graph associated with that article, i.e., the network of contributors and articles.

The illiterate editor: metadata-driven revert detection in Wikipedia

As the community depends more heavily on Wikipedia as a source of reliable information, the ability to quickly detect and remove detrimental information becomes increasingly important. The longer incorrect or malicious information lingers in a source perceived as reputable, the more likely that information will be accepted as correct and the greater the loss to source reputation. We present The Illiterate Editor (IllEdit), a content-agnostic, metadata-driven classification approach to Wikipedia revert detection. Our primary contribution is in building a metadata-based feature set for detecting edit quality, which is then fed into a Support Vector Machine for edit classification. By analyzing edit histories, the IllEdit system builds a profile of user behavior, estimates expertise and spheres of knowledge, and determines whether or not a given edit is likely to be eventually reverted. The system's success in revert detection (0.844 F-measure), together with a feature set disjoint from those of existing content-analyzing vandalism detection systems, shows promise for using IllEdit alongside such systems to increase the reliability of community information.
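The "content-agnostic, metadata-driven" feature set described above can be illustrated with a small sketch. The field names below are hypothetical, not the paper's actual schema; the point is that every feature is derived from edit metadata and the editor's behavioral profile, never from the edit's text, before being handed to whatever classifier (an SVM in the paper's case) makes the revert prediction.

```python
def edit_features(edit, profile):
    """Content-agnostic metadata features for one edit, in the spirit of
    IllEdit. The classifier never sees the edit's text, only behavior.
    Field names are illustrative assumptions, not the paper's schema."""
    return [
        edit["size_delta"],                           # bytes added or removed
        len(edit.get("comment", "")),                 # edit-summary length
        edit["timestamp"] - profile["last_edit_ts"],  # seconds since user's last edit
        profile["reverted_edits"] / max(profile["total_edits"], 1),  # past revert rate
        profile["total_edits"],                       # rough experience proxy
    ]

# Hypothetical editor profile and incoming edit
profile = {"last_edit_ts": 1_000_000, "reverted_edits": 7, "total_edits": 20}
edit = {"size_delta": -512, "comment": "rm POV", "timestamp": 1_000_600}
print(edit_features(edit, profile))
```

Because these features are disjoint from the text-based signals used by content-analyzing vandalism detectors, the two kinds of classifier can in principle be combined, which is the synergy the abstract alludes to.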

Contropedia - the analysis and visualization of controversies in Wikipedia articles

Proceedings of The International Symposium on Open Collaboration - OpenSym '14, 2014

Collaborative content creation inevitably reaches situations where different points of view lead to conflict. In Wikipedia, one of the most prominent examples of collaboration online, conflict is mediated by both policy and software, and conflicts often reflect larger societal debates.