PDP Site FAIR Metrics
Reports 2021-2025
-
Carl Taswell, 2025,
Unfairness by the FAIR Principles Promoters:
Falsifying the Historical Record of Scientific Reports in Knowledge Engineering
versus Maintaining Standards for Objective Truth in Publicly Funded Research,
also available as BHA-2025-17,
presented May 2025 at the ISRES/ARSTE
International Conference on Advances in Technology, Education and Science (ICATES)
in Trabzon, Turkey.
Continuing the series of reports on the unfairness by the FAIR Principles promoters, this third chapter reports the scale and scope of their plagiarism from and ghosting of an entire body of published work on the PORTAL-DOORS Project for the Nexus-PORTAL-DOORS-Scribe Cyberinfrastructure for meta-science applications and data interoperability. This project has been freely available open access since 2007, spanning almost two decades and more than five dozen published research reports, including several issued USPTO patents. The persistence of continuing scientific misconduct by the plagiarizing persons in positions of power raises questions about the politicization of science that rejects reason and rational logic. The misconduct by the FAIR Principles promoters has now been demonstrated to be the largest case of plagiarism and fraud in the modern history of science, engineering, and medicine. Quantitative numerical evidence is presented using both citation counts and grant funding amounts. These grants were obtained by the plagiarists with fraudulent applications which failed to cite and discuss the historical record of published literature, failed to disclose conflicts of interest, and falsified applications to the public funding agencies in violation of the rules at those agencies.
-
Carl Taswell, 2025,
Unfairness by the FAIR Principles Promoters: A Case Study on
Misconduct by Complaint Investigators Who Aid and Abet Plagiarists
presented with
slides
January 2025 at the
Hawaii International Conference on System Sciences 2025 (HICSS 58)
in Waikoloa, Hawaii, and published in the HICSS 2025 Conference Proceedings as pages 6617-6626 at
hdl.handle.net/10125/109639
in the collection
Combating Abuses of Power in Systems.
Accountability for integrity in research publishing has been abandoned at some journals and universities. Published reports have proven the plagiarism by Wilkinson et al of their FAIR Principles from the PORTAL-DOORS Principles previously published by Taswell almost a decade earlier. Despite the flagrant plagiarism in this Wilkinson case, it has not yet been retracted by the journals involved. Complaints submitted by Taswell to publishers and integrity offices were disregarded or denied, thereby enabling the plagiarists to spread their plagiarism with impunity. The case study reported here details an account of one of these sham investigations. Investigators aided and abetted the plagiarists by imposing a requirement of confidentiality on the complainant, excluding the documentary evidence submitted by the complainant, and engaging in protracted delays that failed to slow the propagating plagiarism. Investigations of plagiarism should be conducted openly with public debate as done for jury trials in courts of law.
-
Adam Craig, Carl Taswell, 2024,
FAIR Metrics for Motivating Ethics in Peer Review
presented December 2024 at
the 16th Workshop on Natural Language Processing and Ontology Engineering (NLPOE2024),
in conjunction with the 23rd IEEE/WIC International Conference on Web Intelligence and Intelligent Agent Technology
(WI-IAT 2024)
in Bangkok, Thailand,
also via DOI
10.1109/WI-IAT62293.2024.00106.
Every year, peer reviewers perform countless hours of uncompensated, anonymous labor in order to maintain the integrity of the scholarly literature. However, the high volume of research output in need of review and the scarcity of experts' time make it difficult to maintain the quality of peer review. We previously introduced Fair Attribution to Indexed Reports (FAIR) Metrics that quantify how well a scholarly work cites and discusses prior literature, how many novel concepts it introduces, and how free it is of plagiarism and misattributions. Unlike lexical plagiarism detection, FAIR Metrics analysis relies on identifying statements with equivalent meanings. Using the FAIR Metrics module of the PDP-DREAM Ontology, we recorded the analyses in searchable, machine-readable FAIR Metrics semantic records. This approach has the potential to strengthen the integrity of scholarly publishing by providing a more transparent and systematic way to trace the origins of ideas. Furthermore, FAIR Metrics analysis can provide a basis for integrated multimedia idea plagiarism detection. Figures and tables often serve as visual abstracts that convey the most important points of a work, making their inclusion necessary for a complete analysis of a paper. Instead of having separate metrics of similarity for comparing prose text, tables, figures, and other forms of media, FAIR Metrics analysis involves extracting the claims that each part of the work is communicating. In the present work, we define new FAIR Metrics for assessing the quality of peer review, extend the FAIR Metrics module of the PDP-DREAM Ontology with the additional classes and properties needed to record FAIR Metrics analysis of a review, and demonstrate usage with three example reviews.
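The contrast drawn above between lexical plagiarism detection and meaning-level comparison can be pictured with a toy word-overlap score. This is a sketch for illustration only; none of the code or example sentences below comes from the PDP software.

```python
def jaccard(a, b):
    """Lexical similarity: fraction of words shared by two statements."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

s1 = "the metrics quantify how well a work cites prior literature"
s2 = "these measures score the adequacy of citations to earlier publications"

# The two statements are paraphrases with equivalent meaning, yet they
# share almost no vocabulary, so a purely lexical detector rates them as
# dissimilar; claim-level semantic comparison is needed to link them.
print(round(jaccard(s1, s2), 3))
```

The low score for two statements a human analyst would judge equivalent is exactly the gap that claim-based FAIR Metrics analysis is meant to close.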
-
Adam Craig, Carl Taswell, 2024,
From Open Review to Reproducible Review: FAIR Metrics Analysis of Open Peer Reviews for Brain Informatics Literature
Brainiacs Journal 2024 Volume 5 Issue 2 Edoc QA6A795A3,
also via DOI
10.48085/QA6A795A3.
Brain informatics helps researchers discover and derive new insights from existing data and metadata in brain sciences, medicine, and healthcare, making the documentation of information methods, platforms, and data sources in scholarly meta-research especially important in this field. Evaluation of new reports by expert peer reviewers remains essential to maintaining the integrity of this published research, but determining the best way to assess the quality of these peer reviews has not been addressed adequately and poses an open question about what open peer review should be. Previously, we proposed the paradigm of reproducible peer review, in which a second reviewer should be able to draw on the same factual claims as the first reviewer in order to reach the same conclusion. We introduced a new family of metrics for peer reviews as an extension of the existing families of Fair Attribution to Indexed Reports (FAIR) Metrics to evaluate how well reviewers attributed the claims substantiating their recommendations to the original sources of that information. However, we only demonstrated this new family of FAIR Metrics on five example peer reviews. We report here the results of FAIR Metrics analyses of published open peer reviews on 14 brain informatics articles. These analyses demonstrate the value of the FAIR Metrics by highlighting ways in which the brain informatics community can improve the reproducibility of the peer review process. We call for open peer review that emphasizes references to or quotes from the specific passages of the work under review, indication of which standards of the publication venue the work meets or fails to meet, and citation of the literature when drawing on prior knowledge of the problem domain.
-
Adam Craig, Carl Taswell, 2024,
FAIR Metrics for Motivating Excellence in Peer Review
presented September 2024 at
eScience 2024
the IEEE 20th International Conference on eScience in Osaka, Japan,
also via DOI
10.1109/e-Science62913.2024.10678726.
Past attempts to measure the quality of peer review have relied on either subjective ratings or tangentially related factors such as the sheer number or length of reviews. Previously, we introduced the Fair Attribution to Indexed Reports (FAIR) Metrics to quantify adherence to good citation practices via systematic semantic comparison of statements in the target document to those found in cited and uncited prior reports. In the present work, we define new FAIR Metrics for assessing the quality of peer review, extend the FAIR Metrics module of the PDP-DREAM Ontology with additional classes and properties needed to record FAIR Metrics analysis of a peer review, and demonstrate use with a simple example.
-
Carl Taswell, 2024,
Biomedical Informatics Needs New Nosology for Collective, Community, Social and Public Health
presented with
slides
July 2024 at the AIME 2024
22nd International Conference on Artificial Intelligence in Medicine (AIME)
workshop on
AI and Precision Medicine: Innovations and Applications
in Salt Lake City, Utah, and published as
Brainiacs Journal 2024 Volume 5 Issue 1 Edoc W3A8E3D23,
also via DOI
10.48085/W3A8E3D23.
Pharmacogenomic molecular imaging of neurodegenerative disorders and dementias has served as the motivating problem in precision medicine guiding software development for the past two decades in the PORTAL-DOORS Project (PDP). This work in data sciences, artificial intelligence, biomedical informatics and translational research with clinical trials at Brain Health Alliance has been pursued to support the mission of advancing theranostics with molecular imaging for disorders of the brain and nervous system. The history and published literature associated with PDP for the NPDS Cyberinfrastructure will be surveyed since its inception in 2006. This collection of published work, involving 5 dozen conference and journal papers over 18 years, has always been publicly available at www.PORTALDOORS.org. This review of PDP will highlight PDP-DREAM Software to support truth in science and integrity in research with a call for a new nosology and new metrics to evaluate and measure collective, community, social, and public health.
-
Carl Taswell, 2024,
Unfairness by the FAIR Principles Promoters:
A Case Study on the Absence of Accountability for Integrity in Research Publishing
presented with
slides
May 2024 at the ISRES/ARSTE
International Conference on Advances in Technology, Education and Science (ICATES)
in Alanya, Turkey, and published as
chapter 12, pages 298-325, in the book ISBN 978-6256959675
Current Academic Studies in Technology and Education 2024
edited by T. A. Oliveira and M. T. Hebebci.
This survey reviews and analyzes the evidence from the historical record of published literature relevant to the plagiarism by the Wilkinson et al 2016 FAIR Principles of the previously published Taswell 2007 PORTAL-DOORS Principles. The analysis discusses this plagiarism by Wilkinson et al of Taswell's published research within the cultural framework of practices by both for-profit and not-for-profit publishers that should promote ethics in publishing. These publishing ethics must include the distinction between unintentional omission of citation followed by apology and correction versus intentional exclusion of citation followed by authors' idea-laundering plagiarism with authors' false claims of independent development and by editors' idea-bleaching censorship of public open scientific debate. When both plagiarizing authors and censoring editors act complicitly together in citation cartels with willful disregard of the historical record of published literature available in public online and offline libraries and data repositories, then mis-information, dis-information, anti-information, caco-information, and mal-information will continue to pollute and harm the reproducibility, validity, and integrity of medical, scientific, and engineering research.
-
Adam Craig, Anousha Athreya, Carl Taswell, 2023,
Managing Lexical-Semantic Hybrid Records of FAIR Metrics Analyses with the NPDS Cyberinfrastructure
Brainiacs Journal 2023 Volume 4 Issue 2 Edoc D5B2734F2,
also via DOI
10.48085/D5B2734F2.
Current approaches to plagiarism detection often focus on finding lexical matches rather than semantic similarities in the text content that is compared. But the more important unanswered questions remain whether similar concepts expressed in related topical contexts are semantically equivalent as idea-laundering plagiarism by humans or algorithm-generated plagiarism by machines. Now publicly available and easily accessible, text-generating algorithms have automated the process of assembling a text derived from but not attributed to published content scraped from the web. The FAIR Metrics, with FAIR an acronym for Fair Attribution to Indexed Reports and Fair Acknowledgment of Information Records, measure how appropriately a document cites prior records based on whether they contain similar claims that are equivalent in meaning. We demonstrate herein a workflow with results for manual evaluation of the FAIR Metrics to quantify the extent of plagiarism in 8 articles retracted or reported for plagiarism. We also demonstrate use of the Nexus-PORTAL-DOORS-Scribe (NPDS) Cyberinfrastructure to manage semantic descriptions of the concept mappings and entity equivalence evaluations made using concepts and relationships from the PDP-DREAM Ontology.
-
Adam Craig, Anousha Athreya, Carl Taswell, 2023,
Example Evaluations of Plagiarism Cases Using FAIR Metrics and the PDP-DREAM Ontology
presented October 2023 at the
IEEE 19th International Conference on e-Science
in Limassol, Cyprus;
also via DOI
10.1109/e-Science58273.2023.10254806.
The FAIR Metrics, with acronym FAIR for Fair Acknowledgment of Information Records and Fair Attribution to Indexed Reports, measure how appropriately a document cites prior literature. We demonstrate use of a novel workflow for manual evaluation of the FAIR Metrics on five example publications, three of which were retracted for plagiarism. We recorded results of the analyses in Nexus-PORTAL-DOORS-Scribe (NPDS) records as an open access data set for continuing development of automated plagiarism detection tools.
-
Aniruddh Anand and Carl Taswell, 2022,
An Information-Resilient Big-Data Workbench with PDP-DREAM Software
presented October 2022 at the
85th ASIS&T Annual Meeting of the Association for Information Science and Technology
in Pittsburgh, Pennsylvania.
PORTAL-DOORS Project DREAM Software, available as an open-source C#-centric codebase from a public GitHub repository at PDP-DREAM, implements the PDP-DREAM principles and PDP-FAIR metrics with web-enabled workbench software for distributed data repositories in the Nexus-PORTAL-DOORS-Scribe Cyberinfrastructure. PDP-DREAM Software has been developed for Microsoft platform technologies with ASP.NET Core, SQL Server, and Internet Information Server. As a web-enabled workbench, PDP-DREAM provides many features for big data management with tools and services to support information resilience in defense of truth in science and integrity in research.
-
Adam Craig, Christina Lee, Nithyaa Bala, and Carl Taswell, 2022,
Motivating and Maintaining Ethics, Equity, Effectiveness, Efficiency, and Expertise in Peer Review
Brainiacs Journal 2022 Volume 3 Issue 1 Edoc I5B147D9D,
also via DOI
10.48085/I5B147D9D.
Scientists who engage in science and the scientific endeavor should seek truth with conviction of morals and commitment to ethics. While the number of publications continues to increase, the number of retractions has increased at a faster rate. Journals publish fraudulent research papers despite claims of peer review and adherence to publishing ethics. Nevertheless, appropriate ethical peer review will remain a gatekeeper when selecting research manuscripts in scholarly publishing and approving research applications for grant funding. However, this peer review must become more open, fair, transparent, equitable, and just with new recommendations and guidelines for reproducible and accountable reviews that support and promote fair citation and citational justice. We should engineer this new peer-review process with modern informatics technology and information science to provide and defend better safeguards for truth and integrity, to clarify and maintain the provenance of information and ideas, and to rebuild and restore trust in scholarly research institutions. Indeed, this new approach will be necessary in the current post-truth era to counter the ease and speed with which mis-information, dis-information, anti-information, caco-information, and mal-information spread through the internet, web, news, and social media. The most important question for application of new peer-review methods to these information wars should be ‘Who does what when?’ in support of reproducible and accountable reviews. Who refers to the authors, reviewers, editors, and publishers as participants in the review process. What refers to disclosure of the participants' identities, the material content of author manuscripts and reviewer commentaries, and other communications between authors and reviewers. 
When refers to tracking the sequential points in time for which disclosure of whose identity, which content, and which communication at which step of the peer-review process for which audience of readers and reviewers. We believe that quality peer review, and peer review of peer review, must be motivated and maintained by elevating their status and prestige to an art and a science. Both peer review itself and peer review analyses of peer reviews should be incentivized by publishing peer reviews as citable references separately from the research report reviewed while cross-referenced and cross-linked to the report reviewed.
Reports 2016-2020
-
S. Koby Taswell, Christopher Triggle, June Vayo, Shiladitya Dutta, and Carl Taswell, 2020,
The Hitchhiker's Guide to Scholarly Research Integrity
also via DOI
10.1002/pra2.223
presented with hyperlinked version and
slides
October 2020 at the
83rd ASIS&T Annual Meeting of the Association for Information Science and Technology.
The pursuit of truth in research should be both an ideal in aspiration and also a reality in practice. The PORTAL-DOORS Project (PDP) strives to promote creative authenticity, fair citation, and adherence to integrity and ethics in scholarly research publishing using the FAIR family of quantitative metrics with acronym FAIR for the phrases Fair Attribution to Indexed Reports and Fair Acknowledgment of Information Records, and the DREAM principles with acronym DREAM for the phrase Discoverable Data with Reproducible Results for Equivalent Entities with Accessible Attributes and Manageable Metadata. This report presents formalized definitions for idea-laundering plagiarism by authors, idea-bleaching censorship by editors, and proposed assertion claims for authors, reviewers, editors, and publishers in ethical peer-reviewed publishing to support integrity in research. All of these principles have been implemented in version 2 of the PDP-DREAM ontology written in OWL 2. This PDP-DREAM ontology will serve as the model foundation for development of a software-guided workflow process intended to manage the ethical peer-reviewed publishing of web-enabled open access journals operated online with PDP software.
-
Shiladitya Dutta, Kelechi Uhegbu, Sathvik Nori, Sohyb Mashkoor, S. Koby Taswell, and Carl Taswell, 2020,
DREAM Principles from the PORTAL-DOORS Project and NPDS Cyberinfrastructure
also via DOI
10.1109/ICSC.2020.00044
presented February 2020 at the
14th IEEE International Conference on Semantic Computing
in San Diego, California.
The PORTAL-DOORS Project (PDP) has been pursued to develop the Nexus-PORTAL-DOORS-Scribe (NPDS) cyberinfrastructure as a distributed network system of data repositories to manage lexical and semantic data and metadata from and/or about online and offline resources. Designed with the Hierarchically Distributed Mobile Metadata (HDMM) architectural style in a manner analogous to IRIS-DNS, the NPDS cyberinfrastructure provides distributed multilevel metadata management as an open, flexible, and extensible networked system of independent community customizable who-what-where registries, directories, and diristries for identifying, describing, locating, and linking things on the internet, web and grid. In the current work reported here, we combined our original principles from PDP, HDMM, and NPDS together with additional principles for scientific reproducibility and social engineering related to our family of quantitative metrics with acronym FAIR for Fair Attribution to Indexed Reports and Fair Acknowledgment of Information Records. We call this new consolidated collection of principles the DREAM principles with acronym DREAM for the phrase Discoverable Data with Reproducible Results for Equivalent Entities with Accessible Attributes and Manageable Metadata. To codify these DREAM principles as a concrete artifact for the semantic web, and thus to operationalize their use, we developed an OWL 2.0 ontology that we named the PDP-DREAM ontology.
-
Adam Craig, Adarsh Ambati, Shiladitya Dutta, Arush Mehrotra, S. Koby Taswell, and Carl Taswell, 2019,
Definitions, Formulas, and Simulated Examples for Plagiarism Detection with FAIR Metrics
also via DOI
10.1002/pra2.6
presented with slides October 2019 at the
82nd Annual Meeting of the Association for Information Science & Technology
in Melbourne, Australia.
In prior work, we proposed a family of metrics as a tool to quantify adherence to or deviation from good citation practices in scholarly research and publishing. We called this family of metrics FAIR as an acronym for Fair Attribution to Indexed Reports and Fair Acknowledgment of Information Records, and introduced definitions for these metrics with counts of instances of correct or incorrect attribution or nonattribution in primary research articles with citations for previously published references. In the present work, we extend our FAIR family of metrics by introducing a collection of ratio-based metrics to accompany the count-based metrics described previously. We illustrate the mathematical properties of the ratio-based metrics with various simulated examples in order to assess their suitability as a means of identifying papers under peer review as more or less likely to be suspicious for plagiarism. These FAIR metrics would alert peer reviewers to prioritize low-scoring manuscripts for closer scrutiny. Finally, we outline our planned strategy for future validation of the FAIR metrics with an approach using both expert human analysts and automated algorithms for computerized analysis.
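One way to picture the count-to-ratio construction described above is a single deviation fraction over an article's classified statements. The exact published formulas are not reproduced here; the ratio below is an illustrative assumption only.

```python
def deviation_ratio(misattributed, missing_refs, falsely_original, total):
    """Illustrative ratio-based metric: the fraction of an article's
    statements that deviate from good citation practice. A higher value
    would flag the manuscript for closer scrutiny during peer review."""
    if total == 0:
        return 0.0
    return (misattributed + missing_refs + falsely_original) / total

# A manuscript with 50 statements, 5 of them deviations, scores 0.1.
print(deviation_ratio(misattributed=2, missing_refs=2, falsely_original=1,
                      total=50))
```

Normalizing by the total statement count is what lets such a ratio compare papers of different lengths, which the raw count-based metrics cannot do on their own.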
-
Shiladitya Dutta, Pooja Kowshik, Adarsh Ambati, Sathvik Nori, S. Koby Taswell, and Carl Taswell, 2019,
Managing Scientific Literature with Software from the PORTAL-DOORS Project
also via DOI
10.1109/eScience.2019.00081
presented with slides
and demo video September 2019 at the
Bridging from Concepts to Data and Computation for eScience (BC2DC'19) Workshop
of the
IEEE 15th International Conference on eScience
in San Diego, California. See also IEEE Xplore
eScience conference series proceedings.
Scholarly research associated with finding and citing scientific literature in the 21st century requires new approaches to address the continuing problems that occur with the provenance of content in the literature as well as the peer and editorial review process for publishing this literature. The PORTAL-DOORS Project (PDP) has developed software for the Nexus-PORTAL-DOORS-Scribe (NPDS) cyberinfrastructure in support of identifying, describing, locating and linking things on the internet, web and grid with both lexical and semantic tools and applications. This presentation of our PDP software will highlight Discoverable Data with Reproducible Results for Equivalent Entities with Accessible Attributes and Manageable Metadata with the DREAM principles, and the Fair Acknowledgment of Information Records also called the Fair Attribution to Indexed Reports with the FAIR metrics. This software demonstration will explain use of the network of metadata repositories for scientific literature accessible from www.portaldoors.org, and use of the open source software that powers the NPDS cyberinfrastructure, PDP websites and PDP web services. Our PDP software for the NPDS cyberinfrastructure will be released publicly at this presentation of the software where we will also discuss challenges in the peer review process that include plagiarism detection.
-
Adam Craig, Adarsh Ambati, Shiladitya Dutta, Pooja Kowshik, Sathvik Nori, S. Koby Taswell, Qiyuan Wu, and Carl Taswell, 2019,
DREAM Principles and FAIR Metrics from the PORTAL-DOORS Project for the Semantic Web
also via DOI
10.1109/ECAI46879.2019.9042003
presented with slides June 2019 at the
11th Annual IEEE International Conference on Electronics, Computers and Artificial Intelligence
in Pitesti, Romania.
Articles published in Scientific Data by Wilkinson et al. argued for the adoption of the Findable, Accessible, Interoperable, and Reusable (FAIR) principles of data management without citing any of the prior work published by Taswell. However, these principles were first proposed and described by Taswell in 2006 as the foundation for work on the PORTAL-DOORS Project (PDP) and the Nexus-PORTAL-DOORS-Scribe (NPDS) cyberinfrastructure, and have been published in numerous conference presentations, journal articles, and patents. This work on PDP and NPDS has been continuously available since 2007 from a publicly accessible web site at www.portaldoors.org, and discussed in person at conferences with several key authors of the Wilkinson et al. papers. Paraphrasing without citing the PDP and NPDS principles while renaming them as the FAIR principles raises questions about both the ‘FAIRness’ and the fairness of the authors of the Wilkinson et al. papers. Promoting these principles with the use of the term ‘metrics’, which are not metrics by definition of the term metric as used in most fields of science, also raises questions about their commitment to maintaining consistency of usage for basic terminology across different fields of science as should be expected for terms in ontology mapping with knowledge engineering for the semantic web. Therefore, in the present report, we clarify the origin of their FAIR principles by identifying our PDP and NPDS principles that constitute the historical precedent for their FAIR principles. Moreover, as the comprehensively summarizing phrase for all of our PDP and NPDS principles, we rename them the DREAM principles with the acronym DREAM for Discoverable Data with Reproducible Results for Equivalent Entities with Accessible Attributes and Manageable Metadata. 
Finally, we define numerically valid quantitative FAIR metrics to monitor and measure the DREAM principles from the perspective of the most important principle, i.e., the Fair Acknowledgment of Information Records and Fair Attribution to Indexed Reports, for maintaining fair standards of citation in scholarly research and publishing.
-
Adam Craig and Carl Taswell, 2018,
Formulation of FAIR Metrics for Primary Research Articles
also via DOI
10.1109/BIBM.2018.8621399
presented December 2018 at the
SEPDA Workshop held at the
IEEE 2018 BIBM Conference
in Madrid, Spain.
Measuring the merits of a scholarly article only by how often other articles or social media posts cite it creates a perverse incentive for authors to avoid citing potential rivals. To uphold established standards of scholarship, institutions should also consider one or more metrics of how appropriately an article cites relevant prior work. This paper describes the general characteristics of the FAIR Attribution to Indexed Reports (FAIR) family of metrics, which we have designed for this purpose. We formulate five FAIR metrics suitable for use with primary research articles. Two measure adherence to best practices: number of correctly attributed background statements and number of genuinely original claims. Three measure specific deviations from best practices: number of misattributed background statements, number of background statements with missing references, and number of claims falsely indicated as original. We conclude with a discussion of plans to implement a web application for calculating metric values of scholarly works described by records in Nexus-PORTAL-DOORS System (NPDS) servers.
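The five count-based metrics enumerated above amount to a tally over classified statements. As a rough sketch of that tallying (the label names and data layout here are illustrative assumptions, not the published implementation):

```python
from collections import Counter

# Hypothetical labels for the five count-based FAIR metrics above:
# two adherence counts followed by three deviation counts.
LABELS = (
    "correctly_attributed",  # background statement with a correct citation
    "genuinely_original",    # novel claim rightly presented as original
    "misattributed",         # background statement citing the wrong source
    "missing_reference",     # background statement with no citation at all
    "falsely_original",      # prior claim presented as if it were original
)

def fair_counts(statement_labels):
    """Tally each count-based metric over one article's classified statements."""
    tally = Counter(statement_labels)
    return {label: tally.get(label, 0) for label in LABELS}

# Toy article whose statements a human analyst has already classified.
article = ["correctly_attributed", "correctly_attributed",
           "genuinely_original", "missing_reference"]
print(fair_counts(article))
```

In this sketch, the classification of each statement is the hard part left to a human analyst (or, eventually, an automated tool); the metrics themselves are then simple counts of each category.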
-
Adam Craig and Carl Taswell, 2018,
The FAIR Metrics of Adherence to Citation Best Practices
with poster
presented November 2018 at the SIGMET Workshop
Metrics 2018 held at the
2018 ASIS&T Annual Meeting
of the Association for Information Science & Technology
in Vancouver, British Columbia.
Measuring the merits of scholarly research articles only by citation counts and how often other research articles or social media messages cite a particular publication creates a perverse incentive for some authors to refrain from citing potential rivals. This dilemma has developed despite the historical publishing standard expected in peer review for citing and discussing related prior work. To encourage and support a countervailing incentive, research organizations should also consider metrics for how well and appropriately a scholarly article cites relevant prior work in the spirit of the classic phrase and metaphor standing on the shoulders of giants. We present a proposal for a family of such article-level metrics called the FAIR metrics and described as the FAIR Attribution to Indexed Reports or the FAIR Acknowledgment of Information Records.