Journalism In An Era Of Big Data
Related papers
From Computer-Assisted to Data-Driven: Journalism and Big Data
This is an authors' draft. The final version of this article was published in Journalism, Online First, 15 December 2015 (DOI: 1464884915620205): http://jou.sagepub.com/content/early/2015/12/15/1464884915620205.abstract

Despite claims of continuity, contemporary data journalism is quite different from the earlier tradition of computer-assisted reporting. Although it echoes earlier claims about being scientific and democratic, these qualities are understood as resulting from better data access rather than as something achieved by the journalist. In the context of Big Data in particular, human subjectivity tends to be downgraded in importance, even understood as getting in the way if it means hubristically theorising about causation rather than working with correlation and allowing the data to speak. Increasing ‘datafication’ is not what is driving changes in the profession, however. Rather, the impact of Big Data tends to be understood in ways that are consonant with pre-existing expectations, which are shaped by the broader contemporary post-humanist political context. The same is true in academic analysis, where actor–network theory seems to be emerging as the dominant paradigm for understanding data journalism, but in largely uncritical ways.
Big Data and Journalism: How American Journalism is Adopting the Use of Big Data
Novum Jus
This research uses in-depth interviews with three data journalists from the Houston Chronicle and the New York Times in the United States to describe the role of data journalists and to illustrate how and why they use big data in their stories. Data journalists possess a unique set of skills, including the ability to find data, gather data, and use those data to tell a compelling story in a written and visually coherent way. Results show that as newspapers move to a digital format, the role of the data journalist is becoming more essential, as is the importance of laws such as the Freedom of Information Act, which enable journalists to request and use data to continue to inform the public and hold those in power accountable.
The Mediated Data Model of Communication Flow: Big Data and Data Journalism
KOME: An International Journal of Pure Communication Inquiry, 2018
In recent decades, journalism has undergone considerable transformation, initially fuelled by the digitalization of journalistic workflows and subsequently by the introduction of the Internet, its services, and its effects. Since contemporary journalists employ multiple digital tools and services to gather, administrate, and process information for public consumption, new types/genres of journalism have emerged. Among these, data journalism is one of the most prominent, introduced due to the availability of data in digital form and also to the abundance of efficient online tools that help users analyze, visualize, and publish large amounts of data. Indeed, it is not only the journalistic profession that has changed, but the communication process itself, which has been fundamentally altered to meet the public's current needs and demands. This paper introduces and examines the mediated data model of communication flow to describe these new norms in the mass communication process. Using big data as a case study and moving on to data journalism, we provide a theoretical overview of the model, employing the theory of the two-step flow of communication as a starting point, while attempting to shed light on the current communication process between journalists/media and their initial sources of information.
Communication & Society, 2018
The use of social science methods and computational tools to analyze databases in journalism has had several definitions since Philip Meyer called it precision journalism (PJ). In the last decade, this specialty has seen important development under the term data journalism (DJ), in a distinctive technological and sociocultural environment: Big Data. This research aims to differentiate DJ from PJ and computer-assisted reporting (CAR) from the perspective of science and technology studies, treating the news story as a boundary object between programmers, designers, journalists, and other actors who are now part of the news production process. For this purpose, 14 in-depth interviews were conducted between 2015 and 2017 with data journalists from Spain (8), the United States (1), and Finland (1); academic experts on PJ, DJ, and transparency from Spain (1) and Finland (2); and one expert on transparency laws and access to public information in Spain, Europe, and Latin America. As a result, it can be affirmed that big data is a differentiating element of DJ because it constitutes a sociocultural context in which the philosophy of open data, free software, collaboration, and teamwork are part of its identity.
Data Journalism Beyond Technological Determinism
Journalism Studies, 2021
Innovation in journalism has become an important element in determining the current and future direction of the profession. Through incremental and cumulative transformations over time, and despite many obstacles faced inside newsrooms, journalism has undergone significant and fundamental changes, including the deployment of data journalism. In Latin America, the practice has expanded considerably in recent years. Nevertheless, important technological gaps limit its development. The present study draws upon literature on data journalism, media management, and sociology, aiming to contribute theoretically to data journalism research. Our findings show that, beyond the technological approach, practitioners are relying on data evangelists, collaboration, and audience-centered innovation to produce data storytelling in their newsrooms. On the other hand, these alliances exhibit “homophily” and “endogamy” features that limit the dissemination of the practice and carry potential implications for the social distance of audiences. The study argues that Latin American professionals are distancing themselves from technological determinism to embrace more audience-centric innovation in newsrooms. It also notes that these limitations are important to take into account, as they pose obstacles for research on data journalism innovation. The article concludes with an agenda for future research.
The development of data journalism, and technology paternalism as a challenge
The University of Gothenburg, 2023
This text explores the development of data journalism and highlights technology paternalism as a challenge within this field. It begins with a quote by Holovaty, emphasizing the importance of questioning whether data journalism is indeed journalism. The text covers the past, present, and potential future of data journalism, emphasizing its role in progressing journalism and fostering public trust. The incorporation of data in storytelling and the need for data skills in journalistic production are discussed. The text then delves into the challenges, particularly the issue of technology paternalism. Technology paternalism involves pre-programmed rules without active audience approval, and the role of journalism is significant in this context. The discussion presents arguments about how data journalism can both reduce and contribute to technology paternalism. Examples, such as the New York Times' investigative report on facial recognition and China Daily's coverage of Xinjiang terror attacks, illustrate these points. A case study on the evolution of data journalism in Australian newsrooms is presented, highlighting a shift toward data normalization and simplification. The conclusion touches on the growing use of data and computational tools in newsrooms, sparking discussions about the future of data journalism and its integration into general journalistic techniques.
What is Data Journalism For? Cash, Clicks, and Cut and Trys
The Data Journalism Handbook (2nd Edition), 2018
Permalink: https://datajournalismhandbook.org/handbook/two/reflections/what-is-data-journalism-for-cash-clicks-and-cut-and-trys

The critique I offer flows largely from a Western-centered perspective, if not a US-centered perch, but that does not undermine the essential call to action I put forward: data journalists are still sitting on a potentially revolutionary toolbox for journalism that has yet to be unleashed. The revolution, however, if executed poorly, only stands to further undermine both the user experience and the knowledge-seeking efforts of news consumers and, at worst, to further seed distrust in news. If data journalism just continues to look as it has for the past five to ten years, then it does little to advance the cause of journalism in the digital and platform era. Thus, to start asking this existential question of “What is data journalism for?”, I propose that data journalists, along with less data-focused but web-immersed journalists who work in video, audio, and code, as well as the scholars who poke and prod them, need to rethink data journalism’s origin story, its present rationale, and its future.
As an increasing number of reporters see databases and algorithms as appropriate means of doing investigation, journalism has been challenged in recent years by the following question: to what extent would the processing of huge datasets allow journalists to produce new types of revelations that rely less on normative assumptions? Drawing on the analysis of a particular investigation by the San Francisco-based Center for Investigative Reporting, this article points out the existence of epistemological tensions in the making of journalistic revelations that involve the processing of vast amounts of data. First, I show that the design of data-processing artifacts can match the traditional epistemology of journalistic investigation, but only with great efforts and resources from the organization. Second, I point out that the use of these artifacts by journalists follows two opposite paths to produce the revelation: a “hypothesis-driven” path and a “data-driven” path. Such findings contribute to a better understanding of how news organizations produce justified beliefs, as data-processing artifacts become major components of the newsroom’s environment.
Imagining big data: Illustrations of “big data” in US news articles, 2010–2016
New Media & Society, 2018
Imagining “big data” brings up a palette of concerns about their technological intricacies, political significance, commercial value, and cultural impact. We look at this emerging arena of public sense-making and consider the spectrum of press illustrations that are employed to show what big data are and what their consequences could be. We collected all images from big data-related articles published in the online editions of The New York Times and The Washington Post. As the first examination of the visual dimension of big data news reports to date, our study suggests that big data are predominantly illustrated with reference to their areas of application and the people and materials involved in data analytics. As such, they provide concrete physical form to abstract data. Rather than conceiving of potential ramifications that are more or less likely to materialize, the dominant mode of illustration draws on existing, though often trite, visual evidence.