James A Hodges | San Jose State University
Publications by James A Hodges
Information & Culture, 2023
This article presents a taxonomy of the information practices apparent in an imageboard discussion thread that was influential in jump-starting the worldwide QAnon movement. After introducing QAnon with a review of literature, the author examines 4Chan /pol/ thread #147547939, which introduced multiple key elements of the QAnon narrative, to enumerate and classify the information practices deployed by discussion participants. In conclusion, the article expands beyond existing research's focus on outright fabrication, showing that early QAnon participants' information practices are also defined in large part by suspicious and idiosyncratic modes of reading authentic sources, not simply the propagation of falsehoods.
Journal of Documentation, 2023
Purpose – This article aims to advance a multifaceted framework for preserving algorithms and algorithmic systems in an archival context.
Design/methodology/approach – The article is based on a review and synthesis of existing literature, during which the authors observe emergent themes. After introducing these themes, the authors follow each theme as it manifests in existing digital preservation projects, starting from algorithms' earliest conceptual starting points and moving up through the themes' eventual implementation within a complex social environment.
Findings – The authors find that current literature is largely divided between work that addresses algorithms primarily as computational artifacts and work that views them as primarily social in nature. To bridge this gap, the authors propose that "the algorithm," as frequently deployed in popular discourse, is best understood not as either its technical or its social components alone, but rather as the sum total of both.
Research limitations/implications – The study is limited by its methodology as a literature review. However, the findings point toward a new framing for future research that is less divided in terms of social or material orientation.
Practical implications – Creating multifaceted records of algorithms, the authors argue, enables more effective regulation and management of algorithmic systems, which in turn helps to improve their fairness, accountability, and trustworthiness.
Originality/value – The paper offers a wide variety of case studies with the potential to inform future studies, while contextualizing them together within a new framework that avoids prior limitations.
SAGE Research Methods: Doing Research Online, 2022
How to Collect a Corpus of Websites With a Web Crawler

Introduction
The internet is a vast and ever-changing resource, constantly adding new content and removing or replacing old content. This dynamic quality presents several interesting challenges to the researcher. How can we cite or examine a web resource with any authority, when we know that it might look very different to new visitors in the future? Web crawling is one way of tackling this issue. Web crawlers are a type of software that traverses content on the web to map …

Learning Outcomes
By the end of this guide, readers should be able to:
• Explain and justify the value of web data and crawling online sources for social science research
• Assess and reflect on the ethical challenges of using website data for research
• Understand the techniques, tools, and methodological challenges and limitations of collecting and archiving websites with a web crawler
• Execute research using the aforementioned tools and techniques
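The crawling workflow the guide describes can be sketched in miniature. This is a hypothetical illustration rather than the guide's own code: the `fetch` callable and the same-host restriction are assumptions, and a real research crawler would also need to honor robots.txt, rate limits, and the ethical considerations discussed above.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect href targets from <a> tags in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl restricted to the start URL's host.

    `fetch` is any callable mapping a URL to an HTML string, so the
    network layer (and its politeness handling) stays pluggable and
    testable. Returns a corpus as {url: html} for each page visited.
    """
    host = urlparse(start_url).netloc
    seen, corpus = {start_url}, {}
    queue = deque([start_url])
    while queue and len(corpus) < max_pages:
        url = queue.popleft()
        try:
            html = fetch(url)
        except OSError:
            continue  # unreachable page: skip it, keep crawling
        corpus[url] = html
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(url, href)
            # stay on the starting host; drop already-queued URLs
            if urlparse(target).netloc == host and target not in seen:
                seen.add(target)
                queue.append(target)
    return corpus
```

Passing a stub `fetch` backed by a dictionary makes the traversal logic verifiable offline before any live collection begins.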
Reading Home Cultures Through Books, 2022
Historical precedents: public and private reading
Reading is not, and never has been, a homogeneous or universal set of practices. This is as true in today's digital era as it was in prior eras.
Proceedings of the Association for Information Science and Technology, 2021
This paper introduces a mixed-methods approach for forensically reconstructing the propagation of visual media via networked digital devices. The authors present case studies drawn from political misinformation around the January 6, 2021 riots at the U.S. Capitol. Using interpretive analysis, the authors identify traces of user interfaces that remain in images being shared about the riots. Using computational analysis, the authors evaluate compression levels in digital photographs of the events in question, thus identifying which instances of an image are closer to the source (as well as which images appear to be identical). By combining these two approaches, the authors argue that SMOC BRISQUEt refines our understanding of misinformation's memetic spread, helping to curb future abuses as well as to guide the production of more effective cross-platform spread when desired.
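The computational step above rests on the idea that recompression leaves measurable traces in an image file. One way to approximate such analysis, offered purely as an illustration and not as the authors' actual pipeline, is to parse a JPEG's quantization tables directly from its bytes: larger quantization values indicate coarser, heavier compression, so later-generation copies tend to score higher.

```python
def jpeg_quant_tables(data: bytes):
    """Parse DQT (0xFFDB) segments from raw JPEG bytes and return a
    list of 64-entry quantization tables (8-bit precision entries).

    Sketch only: assumes a plain marker/length segment stream and
    stops at start-of-scan (0xFFDA) or end-of-image (0xFFD9).
    """
    tables = []
    i = 2  # skip the SOI marker (0xFFD8)
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):  # EOI or SOS: no more tables
            break
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == 0xDB:  # DQT segment
            seg = data[i + 4:i + 2 + seg_len]
            j = 0
            while j < len(seg):
                precision = seg[j] >> 4
                if precision == 0:   # 8-bit table entries
                    tables.append(list(seg[j + 1:j + 65]))
                    j += 65
                else:                # 16-bit entries: skip for brevity
                    j += 129
        i += 2 + seg_len
    return tables

def mean_quant(data: bytes) -> float:
    """Crude compression score: mean quantization value across all
    parsed tables. Higher means coarser quantization."""
    vals = [v for t in jpeg_quant_tables(data) for v in t]
    return sum(vals) / len(vals) if vals else float("nan")
```

Comparing these scores across circulating copies of the same image gives a rough ordering of which copies are closer to the source.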
Journal of Documentation, 2021
Purpose – This paper aims to introduce new criteria for evaluating authenticity in digital preservation, particularly in cases related to unreleased software projects and preservation work that occurs in non-institutional settings.
Design/methodology/approach – Interpretive visual and formal analysis of image files is performed on three overlapping preservation efforts to understand the ways that self-appointed preservationists reframe content in varied settings. The unreleased mid-1990s console game Sonic X-Treme is used as a case study because assets from the development process have been widely preserved among former developers and enthusiasts alike.
Findings – The findings indicate that non-professional preservationists transcode original production files into a variety of formats, ranging from lossy compressed images to contemporary three-dimensional (3D) modeling files. Materials are presented in settings that range from colorful webpages mimicking the appearance of commercial software to browsable file systems. These results show that non-institutional preservation practices embody notions of authenticity that diverge significantly from those of professional archivists.
Research limitations/implications – The study is limited by its focus on a single case study, but helps to facilitate ongoing research concerning preservation of unreleased projects insofar as it surveys the current status of existing projects.
Originality/value – Existing studies within preservation literature have established the need for increased attention paid to unfinished digital works. This study introduces new data and interpretative findings that outline such preservation efforts as they already occur in non-institutional settings.
Information & Culture, 2021
Journal of the Association for Information Science and Technology, 2021
This study examines the documents circulated among biomedical equipment repair technicians in order to build a conceptual model that accounts for multilayered temporality in technical healthcare professional communities. A metadata analysis informed by digital forensics and trace ethnography is employed to model the overlapping temporal, format-related, and annotation characteristics present in a corpus of repair manual files crowdsourced during collaborations between volunteer archivists and professional technicians. The corpus originates within iFixit.com's Medical Device Repair collection, a trove of more than 10,000 manuals contributed by working technicians in response to the strain placed on their colleagues and institutions by the COVID-19 pandemic. The study focuses in particular on the Respiratory Analyzer sub-category of documents, which aid in the maintenance of equipment central to the care of COVID-19 patients experiencing respiratory symptoms. The 40 Respiratory Analyzer manuals in iFixit's collection are examined in terms of their original publication date, the apparent status of their original paper copies, the version of PDF used to encode them, and any additional metadata that is present. Based on these characteristics, the study advances a conceptual model accounting for circulation among multiple technicians, as well as alteration of documents during the course of their lifespans.
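The kind of file-level evidence the study draws on, such as the PDF version used to encode each manual, can be read with very little tooling. As a hedged sketch (not the study's own code), the function below pulls the declared PDF version straight from a file's magic header, where every conforming PDF begins with a comment like %PDF-1.4.

```python
from typing import Optional

def pdf_header_version(path: str) -> Optional[str]:
    """Return the declared PDF version (e.g. '1.4') from the file's
    leading '%PDF-' magic bytes, or None if the file is not a PDF.

    Note: this reads only the header comment; the body of a PDF may
    use features newer than the header claims, so treat the value as
    one metadata trace among several.
    """
    with open(path, "rb") as f:
        head = f.read(16)
    if not head.startswith(b"%PDF-"):
        return None
    # the version digits follow '%PDF-' up to the first whitespace
    ver = head[5:].split()[0]
    return ver.decode("ascii", "replace")
```

Run across a corpus of manuals, this single field already hints at when each document was digitised, since PDF versions roughly track the era of the encoding software.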
Internet Histories, 2020
This article examines early digital archival practices, especially those related to historical sources digitised and published to the World Wide Web in the 1990s. Without well-documented professional standards for the digitisation and publication of archival materials online during this period, many archival workers developed innovative, yet idiosyncratic methods of arranging and presenting archival material. Using historical methods informed by digital forensics, this article reconstructs the development practices of one such group of archival workers. The article is structured around a case study examining digitised archival materials pulled from the personal records of American psychologist Timothy Leary, published to Leary.com in the mid-1990s. Forensic analysis of the interface and contents of Leary.com is used to ascertain the dates of development, as well as the specific techniques employed. Next, analysis of the archival arrangement bestowed upon the Web site contents is compared against the professional guidelines generally followed by American archivists, highlighting key differences between the ad-hoc practices of non-institutional archivists and the more formalized procedures followed by peers at established institutions. In conclusion, the article argues that this case study is valuable insofar as it establishes both methodological and historical precedents for deeper engagement with primary sources in Internet history research.
Information Research, 2019
This paper presents bibliographical archaeology as a method for comparing unique characteristics among many copies of the same computer software program. The process is demonstrated using celebrity psychologist Timothy Leary's Mind Mirror software as a case study. After retrieving a suitable corpus, data are examined for patterns, and emergent patterns are interpreted using historical inference. This approach builds on Dalbello-Lovric's case for bibliographical archaeology, expanding it to include new consideration for the unique qualities of born-digital artefacts. Each artefact is classified according to publication date and presence of supporting documentation, before introducing historical sources for additional context. Variations among copies of the same software artefact are found to proliferate well after the objects' initial date of publication. Bibliographical archaeology succeeds in highlighting and contextualising features of an artefact that were previously overlooked. Findings support a media-archaeological view of born-digital artefacts, and bibliographical archaeology is shown to provide a programmatic approach in identifying significant archaeological characteristics among artefacts that have yet to be exhaustively studied.
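A practical first step in comparing many copies of the same program, in the spirit of bibliographical archaeology though not drawn from the paper itself, is to fingerprint each artefact's bytes so that identical copies collapse together and variant copies stand out for closer inspection.

```python
import hashlib
from collections import defaultdict

def digest(data: bytes) -> str:
    """SHA-256 hex digest of an artefact's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def group_identical(copies):
    """Group artefact names whose bytes are byte-for-byte identical.

    `copies` maps a name (e.g. a disk-image filename) to its bytes.
    Returns sorted lists of names; any group of size one is a unique
    variant worth examining for post-publication alteration.
    """
    buckets = defaultdict(list)
    for name, data in sorted(copies.items()):
        buckets[digest(data)].append(name)
    return sorted(buckets.values())
```

For disk images of 1980s software the payoff is immediate: cracked, re-archived, or annotated copies separate cleanly from pristine ones before any manual inspection begins.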
New Media & Society, 2017
Preserving a historically significant video game frequently requires either preserving or adapting a touchable interface for contemporary use. While control techniques are often evaluated in terms of fidelity between in- and out-of-game actions, this essay emphasizes several ways that fidelity must be actively constructed. Bringing a haptic perspective on video gaming into conversation with game history and preservation, this essay examines ways that textual materials surrounding and supplementing a work can be used to construct haptic fidelity. The video game Q*bert is selected as a case study both because of its historical and cultural significance and because it makes idiosyncratic use of controller and force-feedback technologies. The essay concludes that playing Q*bert in a preservation setting requires several unique accommodations at the level of touchable interface, and each accommodation illustrates another way that supplementary texts help construct historical haptic fidelity.
MALware Technical Reports, 2016
Research concerning the cultural, historical, and technical significance of software and computer-related artifacts presents several unique methodological challenges. This technical report uses Timothy Leary's Mind Mirror (Electronic Arts, 1985/1986) as a case study in ways that new users may begin to identify physical, textual, and bibliographic features unique to a given software artifact. Specific features investigated relate to the many computing environments, hardware/software configurations, copying techniques, and supplementary documentation comprising Mind Mirror today. The methods used to investigate these features are presented not only as approaches to analyzing Mind Mirror, but also as ways to approach historical software in general. These methods include software execution across a variety of platforms, as well as textual analysis of supporting documents and electronic file analysis using hex editors.
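Electronic file analysis with a hex editor boils down to viewing offsets, raw bytes, and their printable-ASCII rendering side by side. The minimal hexdump below is offered only as an illustration of the technique the report names, not as its tooling; it reproduces the classic three-column layout that makes embedded strings and format signatures visible in a binary.

```python
def hexdump(data: bytes, width: int = 16) -> str:
    """Render bytes in the classic hex-editor layout:
    hex offset, space-separated hex bytes, printable-ASCII gutter.
    Non-printable bytes appear as '.' in the gutter."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hexed = " ".join(f"{b:02x}" for b in chunk)
        text = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{off:08x}  {hexed:<{width * 3 - 1}}  {text}")
    return "\n".join(lines)
```

Dumping the first kilobyte of a disk image this way is often enough to spot copy-protection notices, cracker signatures, or version strings that distinguish one circulating copy from another.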
Media Fields, 2014
Video arcades are dying everywhere, incompatible with contemporary regimes of commercial space. There was a time when video arcades were considered a threat to the fabric of urban space, and entrepreneurial operators were accused of luring youth into truancy and addiction. Today, it would appear that these moral objections have been eclipsed by simple economic facts. And yet, arcade spaces have not disappeared completely. This paper examines historical sources related to both arcades themselves and the cybernetic theories popular in postwar America, arguing that dominant 20th-century economic systems incorporated and modulated arcade gaming to fit within prevailing circuits of exchange, treating the arcades' controversial qualities like negative feedback in a cybernetic loop.
Analog Game Studies, vol. 2, 2017
Currency (2014), by Chris Wille and Brian Patrick Franklin, uses an antique Burroughs adding machine to control player movement on a constantly printing roll of receipt paper. Players traverse levels generated by real-time Bitcoin exchange rates. Developer Chris Wille describes the project as “an ASCII generated 8 BIT game” modeled after Atari’s classic Defender (1981). But Currency throws players into a map generated by live financial information, not by some developer’s omnipotent hand. Everything about Currency challenges our commonsense classification of games as either “digital” or “analog,” from its paper display to its antique controller.
100 Greatest Video Game Franchises, 2017
Ecco the Dolphin (1992) is a home console video game for the Sega Genesis which embodies themes of interspecies communication and existential horror. This brief essay contextualizes the game within a broader genealogy of American research concerning the psychology of environmental isolation.