A system for generating research impact visualizations over medical research groups
Related papers
Visualizing the Scholarly Impact of Medical Education Researchers
2019
The Carle Illinois College of Medicine is a new professional school with an engineering-focused approach to the medical education of Physician Innovators. Over 200 faculty representing diverse disciplines from across the Illinois campus were identified to provide instruction, curricular support, and mentorship for student-driven research. With faculty drawn from academic departments in STEM, medicine, and the health and life sciences, this trans-disciplinary approach to medical education creates unique opportunities for innovation. While processes to review the academic performance of students are continually being iterated upon, another aspect of assessing the research efficacy of a college is to track the scholarly impact of affiliated faculty. Librarian information professionals embedded in academic libraries have developed expertise in the understanding, use, and application of research metrics. This information has typically been provided in narrative form, non-interactive graphs, or has required ...
Research Impact Indicators and Visualizations
2019
Research impact metrics are used to measure the productivity, influence, and performance of individual faculty, research groups, departments, colleges, and universities, and to compare individuals and units with one another.
Measuring and Visualizing Research Collaboration and Productivity
Journal of Data and Information Science, 2018
Purpose: This paper presents findings of a quasi-experimental assessment to gauge the research productivity and degree of interdisciplinarity of research center outputs. Of special interest, we share an enriched visualization of research co-authoring patterns. Design/methodology/approach: We compile publications by 45 researchers in each of 1) the iUTAH project, which we consider here to be analogous to a "research center," 2) CG1, a comparison group of participants in two other Utah environmental research centers, and 3) CG2, a comparison group of Utah university environmental researchers not associated with a research center. We draw bibliometric data from Web of Science and from Google Scholar. We gather publications for a period before iUTAH had been established (2010-2012) and a period after (2014-2016). We compare these research outputs in terms of publications and citations thereto. We also measure interdisciplinarity using Integration scoring and generate science overlay maps to locate the research publications across disciplines. Findings: We find that participation in the iUTAH project appears to increase research outputs (publications in the After period) and increase research citation rates relative to the comparison group researchers (although CG1 research remains most cited, as it was in the Before period). Most notably, participation in iUTAH markedly increases co-authoring among researchers: in general; for junior as well as senior faculty; for men and women; across organizations; and across disciplines. Research limitations: The quasi-experimental design necessarily generates suggestive, not definitively causal, findings because of the imperfect controls. Practical implications: This study demonstrates a viable approach for research assessment of a center or program for which random assignment of control groups is not possible. It illustrates the use of bibliometric indicators to inform R&D program management. Originality/value: New visualizations of researcher collaboration provide compelling comparisons of the extent and nature of social networking among target cohorts.
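The before/after co-authoring comparison described above can be illustrated with a short Python sketch. The publication records, author names, and year windows below are invented for illustration (the study itself drew its data from Web of Science and Google Scholar); the sketch simply builds weighted co-author graphs for a before and an after period with networkx and compares the number of ties.

```python
# Illustrative sketch only; publication records are made up, not drawn
# from the study's Web of Science / Google Scholar data.
import itertools
import networkx as nx

publications = [
    {"authors": ["Smith", "Jones", "Lee"], "year": 2011},
    {"authors": ["Smith", "Garcia"], "year": 2015},
    {"authors": ["Jones", "Lee", "Garcia"], "year": 2016},
]

def coauthor_graph(pubs, years):
    """Build a weighted co-authorship graph from publications in `years`."""
    g = nx.Graph()
    for pub in pubs:
        if pub["year"] not in years:
            continue
        for a, b in itertools.combinations(sorted(set(pub["authors"])), 2):
            weight = g.get_edge_data(a, b, {}).get("weight", 0)
            g.add_edge(a, b, weight=weight + 1)
    return g

before = coauthor_graph(publications, range(2010, 2013))   # 2010-2012
after = coauthor_graph(publications, range(2014, 2017))    # 2014-2016
print("co-author ties before:", before.number_of_edges())  # 3
print("co-author ties after:", after.number_of_edges())    # 4
```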
Science and Public Policy, 2012
The US National Institutes of Health (NIH) supports basic and applied biomedical research by funding grants and contracts. To measure the outcomes and impact of their programs, NIH staff conduct program evaluations and undertake targeted investigations of research portfolios. Recently, the NIH launched the electronic scientific portfolio assistant (eSPA), a web-based analytics system based on linked scientific databases that provides quantitative information for program officers and planning and evaluation officials managing research portfolios. This system has improved the ability to create and collaboratively refine research portfolios, has reduced the time needed to collect and link outcomes data such as publications and patents, and is providing information used to support research management decisions. After describing the eSPA system, we provide examples of three eSPA evaluation projects that illustrate the impact of this system on NIH evaluation objectives.
How to Document Scientific and Clinical Impact of Research: Six Steps to Success
Perspectives of the ASHA Special Interest Groups
Purpose: The purpose of this article was to provide a brief tutorial about impact metrics and how to use these metrics to document scientific impact. In addition, examples are provided that describe possible ways to document clinical impact of research. Method: We briefly introduce traditional bibliometrics for journals (e.g., impact factor), articles (e.g., citation counts), and authors (e.g., h index). We describe alternative metrics (i.e., altmetrics) that focus on other types of dissemination metrics such as usage (e.g., downloads, reads, and views), engagement (e.g., comments, shares, and replies), and attention (e.g., Altmetric Attention Score). We also discuss how these metrics are used by others to make decisions about employment, tenure and promotion, funding, and the like. We detail six steps to documenting the scientific and clinical impact of your research. Steps include establishing an ORCID (Open Researcher and Contributor ID) account, creating research profiles on aca...
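As a quick illustration of the author-level metrics mentioned above, the following Python sketch computes total citations and an h-index from a toy list of per-paper citation counts. The numbers are invented; real values would come from a source such as Google Scholar, Scopus, or Web of Science.

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations."""
    cited = sorted(citations, reverse=True)
    return sum(1 for rank, count in enumerate(cited, start=1) if count >= rank)

citations_per_paper = [25, 8, 5, 3, 3, 0]  # toy data, not from any real profile
print("total citations:", sum(citations_per_paper))  # 44
print("h-index:", h_index(citations_per_paper))      # 3 (three papers have >= 3 citations)
```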
2022
In recent years, United States federal funding agencies, including the National Institutes of Health (NIH) and the National Science Foundation (NSF), have implemented public access policies to make research supported by funding from these federal agencies freely available to the public. Enforcement is primarily through annual and final reports submitted to these funding agencies, where all peer-reviewed publications must be registered through the appropriate mechanism as required by the specific federal funding agency. Unreported and/or incorrectly reported papers can result in delayed acceptance of annual and final reports and even funding delays for current and new research grants. It is therefore important to make sure every peer-reviewed publication is reported properly and in a timely manner. For large collaborative research efforts, the tracking and proper registration of peer-reviewed publications along with generation of accurate annual and final reports can create a large administrative burden. With large collaborative teams, it is easy for these administrative tasks to be overlooked, forgotten, or lost in the shuffle. To help with this reporting burden, we have developed the Academic Tracker software package, implemented in the Python 3 programming language and supporting Linux, Windows, and Mac operating systems. Academic Tracker helps with publication tracking and reporting by comprehensively searching major peer-reviewed publication tracking web portals, including PubMed, Crossref, ORCID, and Google Scholar, given a list of authors. Academic Tracker provides highly customizable reporting templates so information about the resulting publications is easily transformed into appropriate formats for tracking and reporting purposes. The source code and extensive documentation are hosted on GitHub (https://moseleybioinformaticslab.github.io/academic_tracker/).
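The sketch below is not Academic Tracker's own interface; it is a hedged illustration of the kind of author-based query such a tool performs, using NCBI's public PubMed E-utilities endpoint. The author name is a placeholder, and the requests library is assumed to be available.

```python
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_ids_for_author(author, max_results=20):
    """Return PubMed IDs for records whose author field matches `author`."""
    params = {
        "db": "pubmed",
        "term": f"{author}[Author]",
        "retmax": max_results,
        "retmode": "json",
    }
    resp = requests.get(ESEARCH, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    # "Doe J" is a hypothetical author name used only for illustration.
    print(pubmed_ids_for_author("Doe J"))
```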
SD^2: Slicing and Dicing Scholarly Data for Interactive Evaluation of Academic Performance
IEEE Transactions on Visualization and Computer Graphics, 2022
Comprehensively evaluating and comparing researchers' academic performance is complicated due to the intrinsic complexity of scholarly data. Different scholarly evaluation tasks often require the publication and citation data to be investigated in various ways. In this paper, we present an interactive visualization framework, SD^2, to enable flexible data partition and composition to support various analysis requirements within a single system. SD^2 features the hierarchical histogram, a novel visual representation for flexibly slicing and dicing the data, allowing different aspects of scholarly performance to be studied and compared. We also leverage the state-of-the-art set visualization technique to select individual researchers or combine multiple scholars for comprehensive visual comparison. We conduct multiple rounds of expert evaluation to study the effectiveness and usability of SD^2 and revise the design and system implementation accordingly. The effectiveness of SD^2 is demonstrated via multiple usage scenarios, each aiming to answer a specific, commonly raised question.
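The following is not the SD^2 system itself but a minimal pandas sketch of the slice-and-dice idea it describes: the same publication records are partitioned along different dimensions depending on the evaluation question. All records, venues, and citation counts are invented.

```python
import pandas as pd

# Toy publication records; a real evaluation would pull these from a
# bibliographic database.
records = pd.DataFrame([
    {"author": "A", "venue": "TVCG", "year": 2019, "citations": 40},
    {"author": "A", "venue": "CHI",  "year": 2020, "citations": 12},
    {"author": "B", "venue": "TVCG", "year": 2020, "citations": 25},
    {"author": "B", "venue": "CHI",  "year": 2021, "citations": 5},
])

# Slice by author, then dice by venue: productivity and impact per cell.
per_author_venue = (records
                    .groupby(["author", "venue"])["citations"]
                    .agg(papers="count", total_citations="sum"))
print(per_author_venue)

# The same data re-partitioned by year answers a different question.
per_year = records.groupby("year")["citations"].agg(["count", "sum"])
print(per_year)
```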
Labman: A Research Information System to Foster Insight Discovery Through Visualizations
Communications in Computer and Information Science, 2015
Effective handling of research related data is an ambitious goal, as many data entities need to be suitably designed in order to model the distinctive features of different knowledge areas: publications, projects, people, events and so on. A well designed information architecture prevents errors due to data redundancy, outdated records or poor provenance, allowing both internal staff and third parties to reuse the information produced by the research centre. Moreover, making the data available through a public, Internet accessible portal increases the visibility of the institution, fostering new collaborations with external centres. However, the lack of a common structure when describing research data might prevent non-expert users from using these data. Thus we present labman, a web-based research information system that connects all the actors in the research landscape in an interoperable manner, using metadata and semantic descriptions to enrich the stored data. Labman presents different visualizations to allow data exploration and discovery in an interactive fashion, relying on humans' visual capacity rather than extensive knowledge of the research field itself. Thanks to the visual representations, visitors can quickly understand the performance of experts, project outcomes, publication trajectory and so forth.
Bibliometric Visualization and Analysis Software: State of the Art, Workflows, and Best Practices
2019
Despite the demonstrated value of visualization-based modalities for measuring and mapping science, it remains common practice to search and explore the literature via databases that present lists of articles with little, if any, supplementary visual information. Identifying the desired item in a list is a familiar information retrieval paradigm with a low cognitive load. However, given the rapid emergence of the field of visual text analytics, it is time to challenge the notion that article lists should remain the dominant method to search and organize the scientific literature. One reason that visualization methods are applied relatively rarely in information retrieval may be that it is difficult to develop useful and user-friendly science mapping systems. This article summarizes key workflows for bibliometric mapping, a technique for visually representing information from scientific publications, including citation data, bibliographic metadata, and article content. It describes m...
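As one small, generic example of the bibliometric-mapping workflows surveyed above (not tied to any specific tool from the article), the Python sketch below builds a keyword co-occurrence network from made-up bibliographic metadata; such co-occurrence weights are a common input to science-mapping layouts.

```python
import itertools
from collections import Counter

# Hypothetical keyword lists extracted from bibliographic records.
papers = [
    ["visualization", "bibliometrics", "citation analysis"],
    ["bibliometrics", "science mapping"],
    ["visualization", "science mapping", "citation analysis"],
]

cooccurrence = Counter()
for keywords in papers:
    for pair in itertools.combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

# Edges of the keyword map, weighted by how often two terms appear together.
for (a, b), weight in cooccurrence.most_common():
    print(f"{a} -- {b}: {weight}")
```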
Visualization of the citation impact environments of scientific journals: An online mapping exercise
Journal of the American Society for Information …, 2007
… of the Science Citation Index (5968 journals) and the Social Science Citation Index (1712 journals) are made accessible from the perspective of any of these journals. A vector-space model is used for normalization, and the results are brought online at http://www.leydesdorff.net/jcr04 as input files for the visualization program Pajek. The user is thus able to analyze the citation environment in terms of links and graphs.
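A minimal Python sketch of the kind of exercise described above, under invented toy data: a small journal-to-journal citation matrix is cosine-normalized (a vector-space model) and exported as a Pajek network file via networkx. The journal abbreviations and citation counts are placeholders, not values from the study.

```python
import numpy as np
import networkx as nx

journals = ["JOI", "SCIM", "JASIST"]          # placeholder journal labels
# cites[i][j] = citations from journal i to journal j (illustrative numbers).
cites = np.array([[120.0, 80.0, 30.0],
                  [90.0, 200.0, 60.0],
                  [25.0, 70.0, 150.0]])

# Cosine similarity between the journals' citing profiles (vector-space model).
norms = np.linalg.norm(cites, axis=1, keepdims=True)
cosine = (cites @ cites.T) / (norms @ norms.T)

g = nx.Graph()
for i, src in enumerate(journals):
    for j in range(i + 1, len(journals)):
        g.add_edge(src, journals[j], weight=round(float(cosine[i, j]), 3))

# Write a Pajek .net file that can be opened directly in Pajek.
nx.write_pajek(g, "citation_environment.net")
```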