Addressing the Continued Circulation of Retracted Research as a Design Problem
Nathan D. Woods, Postdoctoral Research Fellow, University of Lethbridge.
Jodi Schneider, Assistant Professor of Information Sciences, University of Illinois at Urbana-Champaign
The RISRS Team
Abstract
In this article, we discuss the continued circulation and use of retracted science as a complex problem: Multiple stakeholders throughout the publishing ecosystem hold competing perceptions of this problem and its possible solutions. We describe how we used a participatory design process model to co-develop recommendations for addressing this problem with stakeholders in the Alfred P. Sloan-funded project, Reducing the Inadvertent Spread of Retracted Science (RISRS). After introducing the four core RISRS recommendations, we discuss how the issue of retraction-related stigma gives rise to recommendation #4, Educate stakeholders about retraction and pre- and post-publication stewardship of the scholarly record. This recommendation is important for training publishing professionals, and realizing it will require further collaborative design work across scholarly communications. We highlight ongoing stakeholder work that is now restarting the design cycle. We conclude with a discussion of ongoing activities facilitating uptake and refinement of the RISRS research and implementation agenda.
1. Introduction
In this paper, we present a case study of a multi-stakeholder, action-oriented research project related to ethics in publishing. The project, Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda (RISRS), was funded by the Alfred P. Sloan Foundation from 2020-2022. As its name suggests, the goal of the project was to develop an actionable agenda for handling the continued spread of retracted research.
This case study has two aims. First, it presents one possible process model for stakeholder-engaged projects in scholarly publishing. This highlights our thinking in developing the project, based on concepts such as sociotechnical systems (Topi, Tucker, and Tucker 2014) and wicked problems (Rittel and Webber 1973), which we believe may be broadly relevant to scholarly publishing. Second, it presents selected outcomes of the project, with particular attention to two recommendations relevant to publishing professionals.
Below, we first describe retraction as a social and technical problem; the continued citation and use of retracted publications; and how wicked problems can be addressed through a participatory design process model. Then, we discuss our case study and how it instantiated the participatory design process model. We then discuss the findings from and outcomes of our case study, with attention to the problem redefinition process that led to our core recommendations. We then detail one particular recommendation regarding the need for stakeholder education, which arose from reframing the relationship between stigma, prestige, and stewardship of the scholarly record. We close by discussing ongoing activities facilitating uptake of the RISRS research and implementation agenda, limitations, and a summary of the paper.
2. Retraction as a Social and Technical Problem
Retraction apprises readers that material is unreliable: it effectively removes from the scholarly record (while generally leaving publicly accessible) articles deemed unreliable or seriously flawed, whether due to honest error or misconduct.
Since the 1980s, retraction of scientific articles has been publicly recognized as a problem affecting the integrity of the scientific record (Lewin 1989; Culliton 1988; Shapiro and Charrow 1989), often framed as driven by falsification, fabrication, and/or plagiarism. In short order, several efforts to ameliorate misconduct followed: in the U.S. Congress (Anderson 1992b; 1992a); through the efforts of editors to innovate in publishing practice (Woolf 1987; Ancker 2004); and more recently, through clarifying the role of research integrity offices in the retraction process (Collaborative Working Group from the conference “Keeping the Pool Clean: Prevention and Management of Misconduct Related Retractions” 2018; Wager et al. 2021). Since 2010, in a direct public intervention, the blog Retraction Watch has reported on retractions, and in October 2018 Retraction Watch released a comprehensive database of all known retractions (Brainard 2018), which, as of late 2021, contains more than 30,000 items (Retraction Watch 2021). Retraction Watch has broadened public understanding of retraction as a phenomenon by highlighting the limited publicity that many retractions received, the widespread impacts of retraction, and the continued circulation of retracted materials.
3. Continued Citation and Use of Retracted Research
Retracted publications may be perpetuated in the scientific publication network through citation and distribution, both before and after retraction, which inadvertently propagates reliance on publications that have been deemed problematic. Citations to retracted publications should document that retracted status. Continuing to cite retracted publications without that awareness can propagate errors through generations of scientific literature (Schneider et al. 2020; van der Vet and Nijveen 2016). It can also cause problems in evidence synthesis when retracted publications are cited as evidence in systematic reviews (Gray et al. 2018). The 2019 Committee on Publication Ethics (COPE) guidelines state that systematic reviews must consider correction or retraction following the retraction of a publication they synthesize (COPE Council 2019), since retraction may change the evidence base (Wiedermann 2018).
We distinguish two aspects of citation: whether or not citing authors were aware of the retraction, and whether a citation is positive or negative. Negative citations[1] and citations documenting retraction status(es) account for only a small fraction of post-retraction citations. In a large-scale study of the PubMed Central database, Hsiao and Schneider found that only 6% of post-retraction citations in biomedicine showed awareness of the retraction (Hsiao and Schneider 2021). Publications that are not retracted at the time of manuscript authoring may subsequently be retracted, and authors are generally unaware of the retraction.
Researchers do find legitimate reasons to cite retracted publications (Hsiao and Schneider 2021), most commonly to provide related work, but also for other reasons, including to provide an example of problematic science, to describe work being reproduced, or to justify exclusion from a systematic review or meta-analysis.
More often, however, citations do not show awareness of the retraction. To address this situation, Fu and Schneider (2020) developed a formal approach called the keystone framework, which combines argumentation theory, argument-based modeling of a scientific publication, and citation content analysis. It enables users to differentiate citations that do not impact the validity of the citing paper from citations that do. In the former case, a mark can be placed next to the citation so that readers are informed of the potential validity issues in those citation contexts. In the latter case, the validity of the entire citing paper is called into question, and additional measures (e.g., alerting authors to double-check their results) need to be taken to prevent science being built on “shaky” or “absent shoulders” (Azoulay et al. 2015).
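Purely to illustrate the triage logic that the keystone framework enables, a minimal sketch follows; this is not the authors' published implementation, and the class and field names are our own:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    NO_ACTION = auto()           # cited work is not retracted
    ANNOTATE_CITATION = auto()   # mark the citation context as pointing to a retracted work
    REVALIDATE_PAPER = auto()    # the citing paper's own validity depends on the retracted work

@dataclass
class Citation:
    cited_doi: str
    cited_is_retracted: bool
    supports_keystone_claim: bool  # does the citing paper's argument depend on this citation?

def triage(citation: Citation) -> Action:
    """Map a citation to a follow-up action, in the spirit of the keystone framework."""
    if not citation.cited_is_retracted:
        return Action.NO_ACTION
    if citation.supports_keystone_claim:
        return Action.REVALIDATE_PAPER
    return Action.ANNOTATE_CITATION

# A background citation to a retracted paper only needs an annotation:
print(triage(Citation("10.1000/example", True, False)))  # Action.ANNOTATE_CITATION
```

The point of the sketch is the two-way split: annotation suffices when the citation is incidental, whereas a keystone dependency calls the citing paper's own conclusions into question.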
High-profile retractions are more likely to be explicitly marked or used appropriately when cited. Two COVID-19-related papers that were quickly retracted from the Lancet and the New England Journal of Medicine were heavily cited, but investigative journalist Charles Piller deemed only about half of the citations inappropriate (Piller 2021). The Wakefield autism/MMR paper was partially retracted in 2004 and fully retracted in 2010; most citations are negative, and more recent citations are more likely to mention the retraction (Suelzer et al. 2019). There is also limited evidence that, in high-profile retraction cases, publicity reduces post-retraction citation (Mott, Fairhurst, and Torgerson 2019).
Researchers have speculated that, beyond mass media attention, some features of the information environment may impact the extent to which researchers notice retraction. Balhara and Mishra (2014) found that the lack of a freely available retraction notice led to a statistically significant increase in post-retraction citations. In a case study of one paper retracted for data falsification yet repeatedly cited 11 years after its retraction, Schneider et al. (2020) documented the difficulty of finding the retraction notice using database metadata and library link resolvers. Moreover, researchers outside a field tend to be more prone to citing retracted publications than researchers inside the field (Bornemann-Cimenti, Szilagyi, and Sandner-Kiesling 2016). Even when retractions are well marked on the publisher’s website, there is significant variation in display (for examples, see Appendix A, page 6 in Suelzer, Deal, and Hanus 2020; Suelzer et al. 2021).
If authors copy references from others without checking the original paper, or retrieve papers from unofficial channels (e.g., pirate copy sites, self-archives, academic social networks) (Dubin 2004; Simkin and Roychowdhury 2005; Wetterer 2006), merely improving the visibility of retraction status(es) on publishers’ websites or databases will still be inadequate. Education of authors is important. Moreover, checking references during the publication process could identify citations to retracted publications and ensure that, if such works are cited, authors are aware of and document the retraction (Davis 2012). This could have a significant impact since, on average, retracted publications receive 22-35 citations each (both pre- and post-retraction) (Chen et al. 2013; Dinh et al. 2019; Pantziarka and Meheus 2019). Some retracted publications receive hundreds or thousands of post-retraction citations (“Top 10 Most Highly Cited Retracted Papers” 2015).
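Such a reference check can be as simple as matching a manuscript's cited DOIs against a maintained list of retracted DOIs. A minimal sketch, assuming a locally maintained CSV of retracted DOIs (for example, data licensed from the Retraction Watch Database); the file name and column header here are placeholders:

```python
import csv

def load_retracted_dois(path: str) -> set[str]:
    """Read retracted DOIs from a local CSV export (placeholder file and column names)."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["doi"].strip().lower() for row in csv.DictReader(f) if row.get("doi")}

def screen_references(reference_dois: list[str], retracted: set[str]) -> list[str]:
    """Return the subset of a manuscript's cited DOIs that appear in the retracted set."""
    return [doi for doi in reference_dois if doi.strip().lower() in retracted]

if __name__ == "__main__":
    retracted = load_retracted_dois("retracted_dois.csv")   # placeholder path
    manuscript_refs = ["10.1016/j.example.2020.01.001"]     # DOIs parsed from the reference list
    for doi in screen_references(manuscript_refs, retracted):
        print(f"Cited work {doi} is retracted; confirm the citation documents the retraction.")
```

A check like this only flags citations; whether a flagged citation is appropriate, and whether the retraction is documented in the text, still requires author or editorial judgement.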
4. Addressing Wicked Problems with a Participatory Design Process Model
The RISRS project approached the problem of retraction, and the continued citation of retracted materials, as a “wicked problem.” Wicked problems are social or cultural problems with many interdependent factors, often with disagreement over the nature of the problem (Hoffmann 2020). Wicked problems may be difficult, if not impossible, to solve because of incomplete or contradictory knowledge; the number of people and divergent opinions involved; the economic cost of proposed solutions; and their enmeshment with other perceived social and technical problems (Rittel and Webber 1973).
Addressing wicked problems involves bringing together the disparate body of people and organizations affected by the problem to build the will to shift perceptions of the problem. It also involves clarifying incentives for stakeholders to work with change processes, as their perceptions of wicked problems shift with re-evaluations of risk, with changing values, and in response to emergent policy and technical resolution processes.
In a wicked problem, collaboration across major stakeholder groups may be challenged by the lack of common agreement about the scope of the problem or the efficacy of strategies to address the issue. Problem definition is a long-studied issue in the social sciences and a practical problem in the study of policy and planning, where problem definitions set the tone for ideas to move through processes of agenda setting, policy development, adoption, implementation, planning, and evaluation (Baumgartner and Jones 2015; Kingdon and Stano 1984; Weiss 1989). How groups define the problem to be solved establishes a horizon of possibility and expectation for proposed solutions, and hence opportunities for intervention. Although problem definitions shape the perceived scope of an issue, the perception of problems and their resolution(s) is often informed by incommensurate priorities. How a problem is scoped and framed may change following reprioritization or efforts to implement solutions. A common strategy for addressing wicked problems is to design a process of problem refinement and prioritization whereby stakeholders are asked to reflect on their values, goals, and background assumptions (Rosenhead 1996; Horn and Weber 2007) to help redefine the scope of a perceived problem in a new way.
The RISRS project sought to consolidate an actionable research and implementation agenda built from stakeholder input. Our goal was to understand variation and overlap in how the continued circulation of retracted research was perceived from different stakeholder vantage points: How does the issue look, for example, from the perspective of publishers or editors compared to researchers or integrity officers? We consolidated stakeholder insights into a continuum of perceived problems and opportunities, and asked stakeholders to work together to prioritize major goals, recommendations, and opportunities for implementing proposed solutions. This work became the basis for the RISRS report (Schneider, Woods, et al. 2021c), which outlines the consensus stakeholder recommendations, short- and long-term, as well as a research and implementation agenda. We loosely modeled the agenda-setting cycle of the RISRS process on a five-stage participatory agenda-setting model (Rosa, Gudowsky, and Warnke 2018; Abma and Broerse 2010) involving exploration, engagement, prioritization, integration, and dissemination (illustrated in Figure 1). The purpose of this process is to synthesize stakeholder feedback to create an actionable agenda to guide research and change activities.
Figure 1: Decision-making process according to the TRANSFORM participatory agenda-setting project (TRANSFORM Project n.d.)
5. Applying the Participatory Design Process Model to Scholarly Publishing: A Case Study from the RISRS Project
RISRS was designed as an 18-month process with a stakeholder workshop at the halfway point. This structure drew on our colleagues' previous work hosting a national forum about text mining with limited-access text (Sandor Namachchivaya 2017; “Data Mining with Limited Access Text: National Forum” n.d.; Senseney et al. 2021). Planned pre-workshop activities included a literature review, a citation analysis, stakeholder interviews, and stakeholder statements to seed the workshop discussion. Planned post-workshop activities focused on creating and disseminating coherent, actionable recommendations for next steps.
In designing the RISRS process, we mapped the basic structure of TRANSFORM to our work with scholarly communications stakeholders. Accordingly, the design process moved through five stages (see Figure 1). We began with an exploration phase, which consisted of a preliminary literature review and the formation of a stakeholder advisory board. This was followed by a stakeholder engagement and consultation process, in which stakeholders were individually consulted regarding problems within their own domain of expertise and encouraged to reflect on, and actively provide feedback about, why the continued citation of retracted research is a problem, how it can be solved, and what strategies might mitigate the issue. We used this information to design the “prioritization/refinement” and “integration” phases of the process, by creating structured opportunities for reflection and discussion during a series of stakeholder workshops. Due to the COVID-19 pandemic, the workshops were ultimately held online in three parts (October 26, November 9, and November 16, 2020). The “dissemination” phase of the process began shortly after the workshops with a series of presentations from February through November 2021. These presentations included talks at NISO Plus 2021 (Bakker et al. 2021; Avissar-Whiting et al. 2021), the Society for Scholarly Publishing 2021 annual meeting (Flanagin et al. 2021), the 2021 annual virtual seminar of the Committee on Publication Ethics (Bilder, Fanelli, and Schneider 2021), the International Society of Managing and Technical Editors (Oransky and Schneider 2021), and the Charleston Conference (Aalbersberg, Lehmann, et al. 2021). Dissemination and implementation remain ongoing as new stakeholder-initiated iterations of the design process develop. Below, we detail the methods that we used in each phase of this participatory agenda-setting process.
Mapping the RISRS process to the TRANSFORM participatory agenda-setting project
The RISRS process used multiple methods to design and organize the overall process and to facilitate participation, small group dynamics, and stakeholder co-production. Additionally, the RISRS agenda-setting process was supported continuously by the active research of the RISRS team. Here, we map out each element of these various methods relative to the stages depicted in the TRANSFORM dialogue process, which served loosely as our major map for the overall organization of the process.
Exploration
At the beginning of the process, we conducted a preliminary scoping review to systematically identify the empirical research literature on retraction and to clarify what we know about retraction, what has been studied, and how it has been studied. We prioritized areas with direct and obvious relevance to the RISRS problems, such as citation and the visibility of retraction status, and focused initially on 162 central papers (RISRS Bibliography v.1 2021).
Engagement/Consultation Activities
Seventy stakeholders participated in the overall consultation process in various ways.
Advisory Board
An advisory board composed of leaders in the field of scholarly communication helped identify and attract stakeholders from diverse related fields and ensured that people with a variety of perspectives were invited to participate. Advisory board members were: Annette Flanagin, Executive Managing Editor and Vice President, Editorial Operations, JAMA and The JAMA Network; C.K. (Tina) Gunsalus, JD, Director, National Center for Professional & Research Ethics, University of Illinois at Urbana-Champaign; Daniele Fanelli, PhD, Fellow in Quantitative Methodology, Department of Methodology, London School of Economics and Political Science; and Ivan Oransky, MD, Co-Founder of Retraction Watch and Editor in Chief of Spectrum.
Stakeholder Enrollment
The RISRS stakeholder consultation played a central role in identifying problems, possible solutions, and collaborative implementation strategies. Stakeholders were actively consulted throughout the RISRS process, contributing to ongoing rounds of feedback, integration, and dissemination, with the aim of introducing change into the scientific publishing ecosystem. Stakeholder dialogue and synthesis have been key to deriving a clear understanding of how retracted science is understood in different professional and sector domains. To facilitate this process, the RISRS team designed a stakeholder enrollment and consultation process and asked stakeholders to help develop a working ecosystem map of concrete actions needed to support cross-sectoral collaboration, which will ultimately help identify pathways for effective implementation.
Stakeholders were invited to engage with the project in July of 2020. Broad inclusivity measures were built into the invitation of potential participants. All stakeholders were invited on the basis of their professional expertise and their role(s) in publishing, research, information technology, universities, and government. However, our efforts toward racial and gender diversity were limited by structural factors. In particular, the scholarly communication industry as a whole is not racially diverse; 80-90% of respondents to recent surveys have self-identified as White (Greco, Wharton, and Brand 2016; Taylor et al. 2020). Even within this group, White people are overrepresented in leadership. Likewise, although women comprise a majority of the scholarly communication industry, they are underrepresented in board and C-suite positions (Greco, Wharton, and Brand 2016; Michael 2017).
Stakeholders were invited to participate in the RISRS project in multiple ways, and were given the choice to participate in an hour-long interview, to contribute an original position paper, to be included on the project’s website, and/or to participate in an online workshop series.
Stakeholders enrolled in this process came from across the scholarly communications ecosystem, including funders, editors, peer reviewers/authors based at both universities and government research facilities, commercial and scholarly publishers, individual researchers, librarians, platform and database providers, software developers, metadata experts, university research integrity officers, lawyers, science journalists, staff at professional organizations, and members of standards-setting organizations.
Stakeholder Interviews
Forty-seven stakeholders from the scholarly communications ecosystem were interviewed. During the interviews, stakeholders were asked about their experience with retracted research; their opinions and attitudes toward the variety of retractions and the harms associated with retraction; and any perspectives or experiences they might offer related to retracted research, its continued citation, and the work of maintaining or correcting the scientific record.
Prioritization/Refinement
By identifying problems, and iteratively brainstorming about problems and opportunities, stakeholders also identified obstacles and pathways to implement particular solutions. We drew upon elements from expert forecasting and group facilitation methods to help structure stakeholder feedback and problem solving, including problem-structuring methods (Mingers and Rosenhead 2004) during the prioritization phase. Workshop activities drew on Liberating Structures (Lipmanowicz and McCandless 2014; n.d.), such as the “15% Solution” brainstorming activity (Lipmanowicz and McCandless n.d.), which invited stakeholders to identify areas where they already had the ability to take action, and the “Min Specs” collaborative agenda-setting exercise (Lipmanowicz and McCandless n.d.), which asked stakeholders to imagine how to implement particular recommendations and then rank implementation ideas by priority. In this sense, the agenda-setting cycle described above was an ongoing feature of the overall RISRS design process.
Each workshop session was developed around a particular group task: Day 1 focused on listening and learning about stakeholder experience with retraction from a variety of participants; Day 2 on collaborative agenda-setting, where stakeholders prioritized problems and opportunities; and Day 3 on implementation topics such as barriers to cooperation and sustaining commitment to act in the short and long term.
Integration
In the integration phase, stakeholders from different backgrounds and with different perspectives were brought together to participate in structured dialogues, working to synthesize across a variety of proposed topics and solutions through discussion, argumentation, and problem refinement activities. Our recommendations were built from this process and were iteratively updated and developed through a series of surveys and public drafts (Schneider, Woods, et al. 2021a; 2021b) as well as at a virtual follow-up meeting on February 16, 2021. These recommendation development and synthesis cycles were based loosely on a Delphi approach to qualitative synthesis (Fletcher and Marchildon 2014).
Additionally, to help guide sense-making and problem structuring activities, results from our qualitative analysis were built into the structured conversations that organized the three-day workshop and integrated throughout the agenda-setting cycle.
Qualitative analysis occurred in three stages: interview transcripts and other documents were coded, culminating in a thematic analysis of the text (Braun and Clarke 2006). Additionally, and parallel to the interview coding, articles from the preliminary literature review were coded to identify problems and opportunities in the literature. Interview and document coding resulted in 41 distinct codes associated with problems and 38 distinct codes associated with opportunities for addressing the problem of retracted research in the scholarly communications ecosystem.
These analytic codes were written up using illustrative quotes that exemplified elements of the themes. This document was circulated prior to the workshop as a type of “member check,” or respondent validation (Birt et al. 2016; Candela 2019). The coding sets and themes were subsequently used to enrich the analysis of materials and to create a composite portrait of how retraction and the continued citation of retracted research are framed as a problem; how the problem is encountered within discrete roles in the scholarly communications ecosystem; and how it relates to broader assumptions about the purpose of publishing, the composition of the scholarly record, and the meaning of retraction.
In the workshops, we built a bridge between prioritization and problem refinement. In the prioritization phase of the cycle, the qualitative analysis derived from the consultations and preliminary literature reviews was initially presented to stakeholders for feedback in Workshop 1, described below. Workshops 2 and 3 initiated the integration phase, where stakeholders were asked to refine and prioritize the identified problems and opportunities, and to nominate new ones where relevant. Finally, in the dissemination phase, stakeholders were involved in several rounds of direct feedback on the elaboration of the recommendations that contributed to the final RISRS report.
Dissemination
Through stakeholder dialogue, problem refinement, and prioritization, aspects of an evolving agenda move on to phases of “integration,” where stakeholders further develop the agenda in sector-specific, meaningful ways; and “dissemination,” where the agenda and its supporting elements may be further developed in implementation coalitions. In some cases, aspects of the evolving stakeholder agenda may move through further rounds of development. This iterative process not only helps develop policy agendas with stakeholder investment but also works to help address problem re-definition, an acknowledged feature of wicked problems (Rittel and Webber 1973; Jentoft and Chuenpagdee 2009).
Restarting the Cycle: Starting Another Round of Participatory Decision Making
Simple problems are those that are already defined, whereas wicked problems are seemingly intractable and indeterminate because stakeholders cannot agree on a definition. Indeed, given the indeterminate nature of wicked problems, attempts to solve them may result in further complications, if not more problems (Roberts 2000; Sherman and Peterson 2009). For example, in agenda-setting processes, the first stage of problem refinement may be undermined by the efforts of stakeholders to actively redefine the problem to be solved, behavior that may overpower previous problem-definition and agenda-setting work (Wood and Doan 2003). Wicked problems “never end” (Rittel and Webber 1973; Vermaas and Pesch 2020). Because wicked problems are linked to other problems, solutions often have unexpected consequences over time. Design processes have been particularly effective at taming wicked problems (Rith and Dubberly 2007; Buchanan 1992) by explicitly engaging stakeholders in collaborative refinement of perceived problems and in the dynamics of problem redefinition. The RISRS process has begun the work of addressing a complex issue through stakeholder engagement and problem refinement, resulting in a series of action-oriented recommendations. As these recommendations are developed by stakeholders, further design processes will be required to help build effective and meaningful implementation.
6. Findings from and Outcomes of the Case Study
Multiple Interlocking Problems
Shortly after we started talking to stakeholders, the RISRS Team discovered that we were not working with one single problem, but rather with a confluence of multiple problems, all loosely related to the continued spread and use of retracted research. Some interviewees framed the issue around individual misconduct or accountability, while others focused on breakdowns in editing and publishing processes. Still others described the need for engineering solutions across multiple overlapping systems. These three frames for retraction (moral failings, publishing process issues, or technical coordination problems) each imply different approaches for successfully mitigating the perceived problems. Refined publishing processes, for example, may gain little traction if authors are reluctant to engage in dialogue with editors about post-publication amendments for fear of the stigma associated with perceived moral failings. Likewise, technical solutions, such as reference management checks for retractions or automated reviews of manuscripts during the submission or publication process, will not be effective unless researchers or publishers are incentivized to use them. By designing opportunities for stakeholder participation and interaction, we have tried to clarify differences in perception as well as identify areas of agreement over both the problem definition and perceived solutions.
During Workshop 1, it became apparent that the group was not only contending with different problem definitions but also with competing perspectives on the ultimate goal: Are we trying to clean up the literature? Are we trying to reform science? Or are we trying to reform science publishing? These goals may overlap, but the scale of interventions and the actions stakeholders should take would depend on which goal was taken as primary. This meant that we needed first to clarify the scope, problem, and goals. As our conversations evolved, we focused on actionable recommendations and sought to draw out intersections between the goals rather than to focus on achieving one or another. To this end, the RISRS recommendations focus on opportunities to cooperate across professional roles and domains.
Design approaches are particularly useful in this situation, where problems are multi-faceted and solutions involve multiple overlapping efforts. The problem definition and the scope of the perceived problem shape the space of possible solutions and determine the resources and collective action required to address the problem. A design perspective focuses on learning about the nature of a problem, often working backwards from problems to solutions without strongly preconceived plans, or iteratively linking actions with new goals as the problem definition evolves.
By taking up the issue of problem definition directly, particularly as it relates to the perception of retraction as a social and technical issue in the scholarly communications ecosystem, we could narrow areas of divergence and identify intersections around potential areas of cooperation or collaboration. The benefit of this design approach is that it allows both researchers and stakeholders to understand the various dimensions attributed to the problem and to begin shifting from at-hand solutions toward understanding the problem in more collaborative terms.
7. Reframing the Problem: The First Round of Recommendations
Here we provide an overview of our core recommendations (Schneider, Woods, et al. 2021c), a major outcome of the process, by illustrating how the recommendations evolved and, through this process, reframed what types of actions could be bundled to address the continued citation of retracted research. We conclude with a discussion of one of these recommendations salient to the education of publishing professionals.
Our core recommendations (Schneider, Woods, et al. 2021c) are:
- Develop a systematic cross-industry approach to ensure the public availability of consistent, standardized, interoperable, and timely information about retractions.
- Recommend a taxonomy of retraction categories/classifications and corresponding retraction metadata that can be adopted by all stakeholders.
- Develop best practices for coordinating the retraction process to enable timely, fair, unbiased outcomes.
- Educate stakeholders about publication correction processes including retraction and about pre- and post-publication stewardship of the scholarly record.
These core recommendations were supported by at least a general consensus among participants in the RISRS workshops. We prioritized recommendations for which there exists momentum to address the issue; known examples that can be used to model standards or best practices; current technologies that can be adopted; and existing or strong agreement on proposals. Additionally, for each recommendation, we identified a list of supporting actions derived from stakeholder input, as well as suggested pathways to effective implementation and areas for future research. The research and implementation agenda (Schneider, Woods, et al. 2021c) in the full RISRS report has 20 pages of detailed suggestions for collaborative action based on the four core recommendations.
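Recommendations #1 and #2 call for consistent, standardized, interoperable retraction information and corresponding metadata. Purely as an illustration of what a minimal, machine-readable retraction record might carry (a sketch of our own; the field names and category value are hypothetical, not an existing or proposed standard), consider:

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class RetractionRecord:
    """Illustrative fields only; the actual taxonomy and metadata are for stakeholders to define."""
    retracted_doi: str               # DOI of the retracted publication
    notice_doi: str                  # DOI of the retraction notice
    date_of_retraction: str          # ISO 8601 date
    nature_of_amendment: str         # e.g., "retraction", "expression of concern"
    reason_category: Optional[str]   # hypothetical classification, e.g., "honest error"
    notice_url: Optional[str] = None

record = RetractionRecord(
    retracted_doi="10.1000/xyz123",
    notice_doi="10.1000/xyz123.retraction",
    date_of_retraction="2021-06-15",
    nature_of_amendment="retraction",
    reason_category="honest error",
)
print(json.dumps(asdict(record), indent=2))  # a record any database or tool could ingest
```

Interoperable records along these lines would allow databases, reference managers, and submission systems to surface retraction status consistently and in a timely way.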
8. Moving the Needle: Encouraging and Facilitating Uptake of the RISRS Research and Implementation Agenda
New bodies of stakeholders have now formed coalitions invested in further co-development of two of the recommendations. The first stakeholder action, established in the COPE orbit, concerns building an industry-supported taxonomy for identifying types of retractions in the publication process (Flanagin et al. 2021). In the National Information Standards Organization (NISO) orbit, a working group has been formed to develop a standards-setting process for the communication of retractions, removals, and expressions of concern (National Information Standards Organization 2021). Additionally, through research and organizing efforts, RISRS continues to support stakeholder elaboration of the basic recommendations from the RISRS report, as well as outlining and co-developing new avenues for implementation and problem scoping. For each of the problems associated with these recommendations, the RISRS design process created the conditions for stakeholder dialogue and problem refinement to shift how perceived problems and solutions are organized. The agenda-setting cycle will continue for recommendations that have been adopted for further development. The issue of stakeholder education regarding retraction and pre- and post-publication stewardship of the scholarly record is still looking for an invested coalition; this recommendation ties together some of the problems associated with stigma and retracted research.
9. From Stigma to Stewardship: The Need for Stakeholder Education
The issue of retraction-related stigma, which arose early in the interviews with stakeholders, is one of the most important topics needing further stakeholder development and a home for that development. Here we discuss stigma as a particularly evocative example of an embedded, nested problem and of how perception of a problem may be reorganized to encourage action. In our interviews, stakeholders associated retraction, and other post-publication amendment processes, with misconduct, with questions of litigation and responsibility, and with issues of prestige. Often, this cluster of issues was framed as an impediment to productive cooperation in the publishing process and as an obstacle to effectively addressing the continued citation of retracted materials. Stigma is broadly relevant to the publication ethics community: along with beliefs about the conduct of science, issues of stigma and prestige form a nested set of beliefs and attitudes about retraction that shape how the continued citation of retracted research, as a sociotechnical problem, is perceived differently, at a very general level, by different groups of stakeholders. For some of the stakeholders we consulted, the continued citation of retracted research is perceived to result from misconduct in the conduct of science: getting rid of misconduct would alleviate these problems in the literature. To prevent the dissemination of retracted materials, in their view, the most efficient course of action is to root out misconduct, either by strengthening the social norms of science, by emphasizing penalties for norm deviation, or by transforming the culture of science to address the issues of prestige and productivity that many identify as the root of scientific misconduct. For others, the solution to the continued citation of retracted research is to clean up the literature, where retraction is but one tool of many in the process of post-publication amendment. Here the issue is not to reform science, but rather to effectively communicate post-publication amendments and to address how the scientific literature is used.
These differences in perception were at play in the RISRS problem-refinement workshops; they mark important stakeholder positions on problems, opportunities, and mitigation strategies. However, these are not mutually exclusive issues, but rather related social issues for which there is no clear technological solution. While stigma is often associated with negative impacts for researchers, in our interviews and collaborative workshops with stakeholders we found that issues of stigma and prestige also affect those working in publishing. For example, researchers who discover mistakes in their published work and feel inspired to take action to “clean up the literature” often face multiple obstacles in doing so. Authors worry about the stigma associated with retraction and the possible negative effects on their careers. Editors also worry about the stigma of retraction and its association with misconduct. They may feel reluctant to retract because of the negative perceptions associated with retraction and the effects these may have on the way prestige is attributed to their journal, or on their careers as editors. Furthermore, publishers and editors worry about how retractions open them up to litigation. For smaller publications stewarded by societies, libraries, and/or individual researchers, retraction and post-publication amendments may be perceived as costly in terms of time and money, as well as risky in terms of reputation. Researchers, in turn, may feel reluctant to report mistakes because of the perceived career risks associated with retraction and the potential judgements they may face from peers, collaborating colleagues, and their home institutions. Addressing the issue of stigma was a priority for the stakeholders attending the RISRS workshops, and there was broad consensus that one pathway forward on this problem is to engage with stakeholder education.
By teasing out how stigma operates across professional domains, we came to understand how, as a wicked problem, the continued circulation of retracted research is anchored in multiple social and technological contexts and conditions. Here we highlight the role stigma plays in the scholarly communications ecosystem to illustrate how the work of problem-refinement can help encourage action, by re-organizing a broadly distributed perception of risk into opportunities for further collaboration.
An example of this refinement and reframing is the RISRS recommendation: Educate stakeholders about retraction and pre- and post-publication stewardship of the scholarly record. The full recommendation covers actions that researchers, authors, topic editors, and other publishing professionals can take to educate themselves about the continued citation of retracted research. It also includes areas where these stakeholders can contribute to a collaborative effort to develop stakeholder-specific education materials (Schneider, Woods, et al. 2021c). In some cases, the issue is expanding, through education, the reach of existing resources. For editors and publishers, the most relevant starting point is COPE. COPE's eLearning module, “Introduction to Publication Ethics,” is freely available on the internet (Committee on Publication Ethics n.d.). COPE also offers editorial flowcharts to help guide editorial decision-making in publication ethics, many of which are available in multiple languages. However, based on stakeholder feedback, additional materials are needed to provide guidance for other stakeholders navigating the process of retraction, particularly for the variety of post-publication amendments, to help guide the use of amended and retracted material circulating in the scholarly communications ecosystem.
Additionally, we identified a need for all editors and publishers to be aware of “honest retraction,” which Retraction Watch refers to as “doing the right thing” (Retraction Watch n.d.). Not all retraction is associated with misconduct. Pervasive error can also be a reason for retraction, whether the retraction is requested by the author or prompted by a reader who discovers a pervasive mistake or error in the data.
During our consultations, stakeholders conveyed a strong sense that the value of retraction is in “cleaning up the literature,” but that this is made more difficult due to the stigma associated with retraction. Well-known cases of honest retraction are often championed as examples of science as a “self-correcting” practice (Fanelli 2016; Retraction Watch n.d.; The Editors of The Lancet 2015). However, while these examples are often celebrated, there are impediments to honest retraction at multiple points in the publication chain (Alberts et al. 2015; Rohrer 2021; Vorland et al. 2020). Such challenges were mentioned in our interviews with stakeholders. Some authors seeking to do an “honest retraction” need funding to help offset the possible effects on the careers of their trainees and other members of their lab; some authors report that they need to persuade an editor to pursue an “honest retraction” of their own paper. Because retraction may be (incorrectly) taken to imply misconduct, stakeholders voiced concerns about the negative impact retractions could have on graduate students and early-career researchers, as well as lab managers and collaborators, even when there is no misconduct or when they are not responsible for the misconduct. This illustrates the types of obstacles for author-initiated post-publication amendments and some of the impediments to the widespread development of a robust, broadly distributed, ethic of stewardship for the scientific and scholarly record.
This recommendation is the outcome of stakeholder-informed co-development. During our workshops, there was acknowledgement that the issue of stigma is complex: to be properly addressed, stakeholders would need to work together to speak to its importance and collaborate with their peers to address the needs of specific groups. Publishing professionals can help reduce the stigma of retraction by assisting in the creation of educational materials that link the variety of post-publication amendment processes to real-world examples and illustrate that not all retraction is due to misconduct. Likewise, these educational materials should reflect the challenges that researchers and others working in the scholarly communications ecosystem face in active stewardship of the scholarly record, such as time constraints and unfamiliarity with the publishing process. Additionally, to help contextualize the strong association between post-publication amendment and misconduct, there was some acknowledgment that discussion of publication ethics and post-publication amendments should be incorporated into Responsible Conduct of Research education.
In the future, retraction should be understood as one point on a continuum of post-publication amendment, continuous with corrections, versioning, and other updates, and as functioning ultimately to help uphold the value and quality of the peer-reviewed literature. Education across multiple roles in and around scholarly communication is needed: for researchers, research managers, scholarly and professional societies, institutional officers, and literature users, and across the broader research ecosystem, including clinicians and other practitioners, policy experts, regulators, corporate actors, editors, preprint managers, peer reviewers, and publishers. As with our other recommendations, RISRS recommendation #4 will require further iterative development by an invested stakeholder coalition.
10. Limitations
We focused on the English-language publishing community, and our attendees were largely drawn from Europe and North America due to the logistics of hosting online meetings with people from a variety of time zones. Stakeholders were primarily from universities, government, large scientific societies, large publishers, technology organizations, and funders. We had trouble identifying small publishers and Global South publishers; we also did not seek to identify industry researchers. Retraction has been addressed in the medical community for longer and with more emphasis than in the wider industry; other fields in the sciences and social sciences have occasionally been the subject of empirical research on retraction, but very little research has examined humanities-related retractions (Proescholdt et al. 2021).
11. Conclusions
Design processes can be helpful for problems that seem “stuck,” for which there is no agreed-upon solution or which require the collaboration of a great number of people to find appropriate solutions (Rittel and Webber 1973). In this paper, we have outlined our process for moving the needle on one stuck problem: the continued citation of retracted research. Multiple competing perceptions of this problem, retraction-related social stigma, and unclear lines of authority or responsibility for addressing it complicate stakeholder coordination. The RISRS project brought stakeholders together to reflect on the nature of the problem, divergent viewpoints, and areas of intersection. This participatory design process led to a series of recommendations delineating areas where there is broad agreement and where stakeholders might productively collaborate to implement proposed solutions and engage in further collaborative problem solving. As a case study, our work highlights the potential benefit of participatory design processes for addressing problems in scholarly publishing by building investment and cooperation amongst stakeholder groups. Given the nature of the problem, the core recommendations derived from our stakeholder-engaged process are a successful but limited outcome; further efforts will be needed to continue the design process. Some recommendations have already been taken up by professional groups for additional, iterative participatory design, while other recommendations, such as stakeholder education on stewardship of the scholarly record, need an invested coalition.
In general, the scholarly publication ecosystem is composed of multiple overlapping communities with different evaluation systems addressing different audiences for a variety of purposes. Consequently, publication ethics cannot be addressed as a monolith. Although ethics in the publication process are often understood in terms of individual actions and professional decisions, no single individual and no single professional body can fully address problems in this ecosystem alone. Accordingly, sociotechnical approaches such as participatory design and frameworks like wicked problems may be beneficial for iteratively addressing challenges. Scholarly publishing professionals should become familiar with the diversity and complexity of issues in publication ethics, including the continued citation of retracted research. This is an important area to consider in the ethical education of publishing professionals.
Acknowledgements
Alfred P. Sloan Foundation G-2020-12623
CRediT
The RISRS Team: Yoss Arianlou: Investigation; Halle Burns: Investigation; Mary Terese Campbell: Project Administration; Yuanxi Fu: Investigation; Katherine Howell: Investigation; Tzu-Kun (Esther) Hsiao: Investigation; Randi Proescholdt: Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Project Administration; Jodi Schneider: Conceptualization, Data Curation, Formal Analysis, Funding Acquisition, Investigation, Methodology, Project Administration, Resources, Supervision, Writing, Reviewing, & Editing; Will White: Investigation; Nathan D. Woods: Conceptualization, Data Curation, Formal Analysis, Investigation, Methodology, Project Administration, Writing, Reviewing, & Editing; Yee Yan ‘Vivien’ Yip: Investigation.
Thank you to all anonymous interviewees; to our collaborators, RISRS workshop attendees: IJsbrand Jan Aalbersberg, Elsa Alvaro, Michele Avissar-Whiting, Monya Baker, Caitlin Bakker, Joanne Berger, Lisa Bero, Elisabeth Bik, Geoff Bilder, Stephanie Boughton, Helena Cousijn, Jennifer Deal, Nicholas De Vito, Daniele Fanelli, Ashley Farley, Patricia Feeney, Annette Flanagin, Kathryn Funk, David Gillikin, CAPT Stephen Gonsalves, Josh Greenberg, Francesca Grifo, C.K. Gunsalus, Karen Hanus, Joerg Heber, Hannah Heckner, Tom Heyman, Kathryn Kaiser, Daniel T. Kulp, Stacey Lavelle, Christopher Lehmann, James Leung, Dmitry Malkov, Aaron Manka, Michael Markie, Kathrin McConnell, Alice Meadows, David Moher, Josh Nicholson, Ivan Oransky, Laura Paglione, Katrina Pickersgill, Deborah Poff, Jessica Polka, Sarah Robbie, Pamela Ronald, Bruce D. Rosenblum, Barbara Ruggeri, John Seguin, Eefke Smit, Elizabeth Suelzer, Sean Takats, Nicole Theis-Mahon, Randy Townsend; and to others who provided feedback on the RISRS report: Alison Avenell, Lex Bouter, Jennifer Byrne, Renee Hoch, Hervé Maisonneuve, and Iratxe Puebla.
For feedback on this paper, thank you to Kiel Gilleade.
[Author Bios]
Nathan D. Woods is a Postdoctoral Research Fellow at the University of Lethbridge. He is an interdisciplinary scholar, practicing anthropologist, and information professional working at the intersection of research and practice on issues related to the production, use, and stewardship of science, scholarship, and cultural memory. Working with multiple communities of practice, his larger research agenda considers the complex and dynamic relationships between knowledge, the design of institutions, and the organization of expert work. Ongoing projects explore the changing organization of scholarship; the science-policy interface; and the democratization of knowledge production. He holds a PhD in Anthropology from the CUNY Graduate Center; an MSLIS from the University of Illinois, Urbana-Champaign; and an interdisciplinary BA from the Evergreen State College. His work has been funded by the National Science Foundation and the American Philosophical Society.
Jodi Schneider is Assistant Professor at the School of Information Sciences, University of Illinois at Urbana-Champaign where she runs the Information Quality Lab. She studies the science of science through the lens of arguments, evidence, and persuasion with a special interest in controversies in science. Her recent work has focused on topics such as systematic review automation, semantic publication, and the citation of retracted papers. Interdisciplinarity (PhD in Informatics, MS Library & Information Science, MA Mathematics; BA Great Books/liberal arts) is a fundamental principle of her work. She has held research positions across the U.S. as well as in Ireland, England, France, and Chile. She leads the Alfred P. Sloan-funded project, Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda.
The RISRS Team worked on the Alfred P. Sloan-funded project, Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda in 2020-2022.
References
Aalbersberg, IJsbrand Jan, Christopher Lehmann, Jodi Schneider, and Elizabeth Suelzer. 2021. “A Cross-Industry Discussion on Retracted Research: Connecting the Dots for Shared Responsibility.” Presented at the 2021 Charleston Library Conference, Virtual and Charleston, SC, November 4. http://hdl.handle.net/2142/112777.
Abma, Tineke A., and Jacqueline E. W. Broerse. 2010. “Patient Participation as Dialogue: Setting Research Agendas.” Health Expectations 13 (2): 160–73. https://doi.org/10.1111/j.1369-7625.2009.00549.x.
Alberts, B., R. J. Cicerone, S. E. Fienberg, A. Kamb, M. McNutt, R. M. Nerem, R. Schekman, et al. 2015. “Self-Correction in Science at Work.” Science 348 (6242): 1420–22. https://doi.org/10.1126/science.aab3847.
Ancker, J. 2004. “Proceedings of the Retreat on The Journal’s Role in Scientific Misconduct: A Retreat by the Council of Science Editors with Funding from the Office of Research Integrity 7-9 November 2003 Lansdowne Resort and Conference Center, Lansdowne, Virginia.” Science Editor 27 (3): 75–85.
Anderson, Christopher. 1992a. “Congress Looks for Methods to Assess Clinical Research.” Nature 357 (6373): 5.
———. 1992b. “Bill Would Force Journals to Follow Misconduct Rules.” Nature 357 (6373): 7. https://doi.org/10.1038/357007b0.
Avissar-Whiting, Michele, Caitlin Bakker, Hannah Heckner, Sylvain Massip, Jodi Schneider, Randy Townsend, and Nathan D. Woods. 2021. “Addressing Disorder in Scholarly Communication: Strategies from NISO Plus 2021.” Information Services & Use Preprint (Preprint): 1–15. https://doi.org/10.3233/ISU-210113.
Azoulay, Pierre, Jeffrey L. Furman, Joshua L. Krieger, and Fiona Murray. 2015. “Retractions.” Review of Economics and Statistics 97 (5): 1118–36. https://doi.org/10.1162/REST_a_00469.
Bakker, Caitlin J., Jodi Schneider, Randy Townsend, Michele Avissar-Whiting, Hannah Heckner, Charles Letaillieur, and Sylvain Massip. 2021. “Misinformation and Truth: From Fake News to Retractions to Preprints.” In NISO Plus 2021. https://niso.cadmoremedia.com/Title/2a60b6a4-3050-4e41-993c-27b65c1acf48.
Balhara, Yatan Pal Singh, and Ashwani Mishra. 2014. “Compliance of Retraction Notices for Retracted Articles on Mental Disorders with COPE Guidelines on Retraction.” Current Science 107 (5): 757–60.
Baumgartner, Frank R., and Bryan D. Jones. 2015. The Politics of Information: Problem Definition and the Course of Public Policy in America. University of Chicago Press.
Bilder, Geoffrey, Daniele Fanelli, and Jodi Schneider. 2021. “Seminar 2021: Reducing the Inadvertent Spread of Retracted Science: Taxonomy Considerations.” Presented at the COPE Seminar 2021, Virtual, September 29. https://publicationethics.org/resources/seminars-and-webinars/retractions-taxonomy.
Birt, Linda, Suzanne Scott, Debbie Cavers, Christine Campbell, and Fiona Walter. 2016. “Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?” Qualitative Health Research 26 (13): 1802–11. https://doi.org/10.1177/1049732316654870.
Bornemann-Cimenti, Helmar, Istvan S. Szilagyi, and Andreas Sandner-Kiesling. 2016. “Perpetuation of Retracted Publications Using the Example of the Scott S. Reuben Case: Incidences, Reasons and Possible Improvements.” Science and Engineering Ethics 22 (4): 1063–72. https://doi.org/10.1007/s11948-015-9680-y.
Brainard, Jeffrey. 2018. “Rethinking Retractions.” Science 362 (6413): 390–93. https://doi.org/10.1126/science.362.6413.390.
Braun, Virginia, and Victoria Clarke. 2006. “Using Thematic Analysis in Psychology.” Qualitative Research in Psychology 3 (2): 77–101. https://doi.org/10.1191/1478088706qp063oa.
Buchanan, Richard. 1992. “Wicked Problems in Design Thinking.” Design Issues 8 (2): 5–21. https://doi.org/10.2307/1511637.
Candela, Amber. 2019. “Exploring the Function of Member Checking.” The Qualitative Report 24 (3): 619–28. https://doi.org/10.46743/2160-3715/2019.3726.
Chen, Chaomei, Zhigang Hu, Jared Milbank, and Timothy Schultz. 2013. “A Visual Analytic Study of Retracted Articles in Scientific Literature.” Journal of the American Society for Information Science and Technology 64 (2): 234–53. https://doi.org/10.1002/asi.22755.
Collaborative Working Group from the conference “Keeping the Pool Clean: Prevention and Management of Misconduct Related Retractions.” 2018. “RePAIR Consensus Guidelines: Responsibilities of Publishers, Agencies, Institutions, and Researchers in Protecting the Integrity of the Research Record.” Research Integrity and Peer Review 3 (December): 15. https://doi.org/10.1186/s41073-018-0055-1.
Committee on Publication Ethics. n.d. “Introduction to Publication Ethics.” COPE: Committee on Publication Ethics. Accessed November 9, 2021. https://publicationethics.org/resources/elearning/introduction-publication-ethics-0.
COPE Council. 2019. “Retraction Guidelines.” https://doi.org/10.24318/cope.2019.1.4.
Culliton, Barbara J. 1988. “Random Audit of Papers Proposed: Audit, Conducted as Scientific Experiment, Could Provide Factual Evidence on Integrity of Published Papers.” Science 242 (4879): 657–58. https://doi.org/10.1126/science.3187510.
“Data Mining with Limited Access Text: National Forum.” n.d. Accessed November 14, 2019. https://publish.illinois.edu/limitedaccess-tdm/.
Davis, Philip M. 2012. “The Persistence of Error: A Study of Retracted Articles on the Internet and in Personal Libraries.” Journal of the Medical Library Association 100 (3): 184–89. https://doi.org/10.3163/1536-5050.100.3.008.
Dinh, Ly, Janina Sarol, Yi-Yun Cheng, Tzu-Kun Hsiao, Nikolaus Parulian, and Jodi Schneider. 2019. “Systematic Examination of Pre- and Post-Retraction Citations.” In Proceedings of the Association for Information Science and Technology, 56:390–94. https://doi.org/10.1002/pra2.35.
Dubin, David. 2004. “The Most Influential Paper Gerard Salton Never Wrote.” Library Trends 52 (4): 748–64.
Fanelli, Daniele. 2016. “Set up a ‘Self-Retraction’ System for Honest Errors.” Nature 531 (7595): 415. https://doi.org/10.1038/531415a.
Flanagin, Annette, Hannah Heckner, Deborah Poff, John Seguin, and Jodi Schneider. 2021. “A Cross-Industry Discussion on Retracted Research: Connecting the Dots for Shared Responsibility.” Presented at the Society for Scholarly Publishing 43rd Annual Meeting. http://hdl.handle.net/2142/110140.
Fletcher, Amber J., and Gregory P. Marchildon. 2014. “Using the Delphi Method for Qualitative, Participatory Action Research in Health Leadership.” International Journal of Qualitative Methods 13 (1): 1–18. https://doi.org/10.1177/160940691401300101.
Fu, Yuanxi, and Jodi Schneider. 2020. “Towards Knowledge Maintenance in Scientific Digital Libraries with the Keystone Framework.” In Proceedings of the ACM/IEEE Joint Conference on Digital Libraries in 2020 (JCDL ’20), 217–26. https://doi.org/10.1145/3383583.3398514.
Gray, Richard, Amal Al-Ghareeb, Jenny Davis, Lisa McKenna, and Stav Amichai Hillel. 2018. “Inclusion of Nursing Trials in Systematic Reviews after They Have Been Retracted: Does It Happen and What Should We Do?” International Journal of Nursing Studies 79 (March): 154. https://doi.org/10.1016/j.ijnurstu.2017.12.006.
Greco, Albert N., Robert M. Wharton, and Amy Brand. 2016. “Demographics of Scholarly Publishing and Communication Professionals.” Learned Publishing 29 (2): 97–101. https://doi.org/10.1002/leap.1017.
Hoffmann, Michael H. G. 2020. “Reflective Consensus Building on Wicked Problems with the Reflect! Platform.” Science and Engineering Ethics 26: 793–819. https://doi.org/10.1007/s11948-019-00132-0.
Horn, Robert E., and Robert P. Weber. 2007. “New Tools for Resolving Wicked Problems: Mess Mapping and Resolution Mapping Processes.” v. 1.2. Watertown, MA: Strategy Kinetics LLC. https://www.strategykinetics.com/New_Tools_For_Resolving_Wicked_Problems.pdf.
Hsiao, Tzu-Kun, and Jodi Schneider. 2021. “Continued Use of Retracted Papers: Temporal Trends in Citations and (Lack of) Awareness of Retractions Shown in Citation Contexts in Biomedicine.” Quantitative Science Studies. https://doi.org/10.1162/qss_a_00155.
Jentoft, Svein, and Ratana Chuenpagdee. 2009. “Fisheries and Coastal Governance as a Wicked Problem.” Marine Policy 33 (4): 553–60. https://doi.org/10.1016/j.marpol.2008.12.002.
Kingdon, John W., and Eric Stano. 1984. Agendas, Alternatives, and Public Policies. Boston: Little, Brown.
Lewin, B. 1989. “Fraud and the Fabric of Science.” Cell 57 (5): 699–700. https://doi.org/10.1016/0092-8674(89)90781-2.
Lipmanowicz, Henri, and Keith McCandless. 2014. The Surprising Power of Liberating Structures: Simple Rules to Unleash a Culture of Innovation. Liberating Structures Press.
———. n.d.-a. “Liberating Structures - 7. 15% Solutions.” Accessed November 16, 2021. https://www.liberatingstructures.com/7-15-solutions/.
———. n.d.-b. “Liberating Structures - 14. Min Specs.” Accessed November 16, 2021. https://www.liberatingstructures.com/14-min-specs/.
———. n.d.-c. “Liberating Structures - Introduction.” Accessed November 11, 2019. http://www.liberatingstructures.com/.
Michael, Ann. 2017. “Ask the Chefs: How Can We Increase Diversity in Scholarly Communications?” The Scholarly Kitchen (blog). November 16, 2017. https://scholarlykitchen.sspnet.org/2017/11/16/diversity-scholarly-communications/.
Mingers, John, and Jonathan Rosenhead. 2004. “Problem Structuring Methods in Action.” European Journal of Operational Research 152 (3): 530–54. https://doi.org/10.1016/S0377-2217(03)00056-0.
Mott, Andrew, Caroline Fairhurst, and David Torgerson. 2019. “Assessing the Impact of Retraction on the Citation of Randomized Controlled Trial Reports: An Interrupted Time-Series Analysis.” Journal of Health Services Research & Policy 24 (1): 44–51. https://doi.org/10.1177/1355819618797965.
National Information Standards Organization. 2021. “NISO Voting Members Approve Work on Recommended Practice for Retracted Research.” September 21, 2021. https://www.niso.org/press-releases/2021/09/niso-voting-members-approve-work-recommended-practice-retracted-research.
Oransky, Ivan, and Jodi Schneider. 2021. “Reducing Retracted Science: Best Practice for the Editorial Office.” Presented at the ISMTE 2021 Global Virtual Event, Virtual, October 12.
Pantziarka, Pan, and Lydie Meheus. 2019. “Journal Retractions in Oncology: A Bibliometric Study.” Future Oncology 15 (31): 3597–3608. https://doi.org/10.2217/fon-2019-0233.
Piller, Charles. 2021. “Disgraced COVID-19 Studies Are Still Routinely Cited.” Science 371 (6527): 331–32. https://doi.org/10.1126/science.371.6527.331.
Proescholdt, Randi, Jodi Schneider, Yuanxi Fu, Nathan D. Woods, and Katherine Howell. 2021. Empirical Retraction Lit (version 2.20.0). https://doi.org/10.5281/zenodo.5498500.
Retraction Watch. 2021. “A New Milestone: There Are Now 30,000 Retractions in the Retraction Watch Database.” Tweet. @RetractionWatch. September 10, 2021. https://twitter.com/RetractionWatch/status/1436119642028773380.
———. n.d. “Doing the Right Thing.” Retraction Watch. Accessed November 9, 2021. https://retractionwatch.com/category/by-reason-for-retraction/doing-the-right-thing/.
RISRS Bibliography v.1. 2021. InfoQualityLab. https://github.com/infoqualitylab/EPPI-Reviewer_to_Exhibit_JSON.
Rith, Chanpory, and Hugh Dubberly. 2007. “Why Horst W. J. Rittel Matters.” Design Issues 23 (1): 72–91. https://doi.org/10.1162/desi.2007.23.1.72.
Rittel, Horst W. J., and Melvin M. Webber. 1973. “Dilemmas in a General Theory of Planning.” Policy Sciences 4 (2): 155–69. https://doi.org/10.1007/BF01405730.
Roberts, Nancy. 2000. “Wicked Problems and Network Approaches to Resolution.” International Public Management Review 1 (1): 1–19.
Rohrer, Julia. 2021. “A Self-Correcting Fallacy – Why Don’t Researchers Correct Their Own Errors in the Scientific Record?” Impact of Social Sciences (blog). April 13, 2021. https://blogs.lse.ac.uk/impactofsocialsciences/2021/04/13/a-self-correcting-fallacy-why-dont-researchers-correct-their-own-errors-in-the-scientific-record/.
Rosa, Aaron, Niklas Gudowsky, and Philine Warnke. 2018. “But Do They Deliver? Participatory Agenda Setting on the Test Bed.” European Journal of Futures Research 6 (1): 14. https://doi.org/10.1186/s40309-018-0143-y.
Rosenhead, Jonathan. 1996. “What’s the Problem? An Introduction to Problem Structuring Methods.” Interfaces 26 (6): 117–31. https://doi.org/10.1287/inte.26.6.117.
Sandore Namachchivaya, Beth. 2017. “IMLS Grant LG-73-17-0070-17: National Forum: Data Mining Research Using In-Copyright and Limited-Access Text Datasets: Shaping a Research and Implementation Agenda for Researchers, Libraries, and Content Providers.” Institute of Museum and Library Services. 2017. https://www.imls.gov/sites/default/files/grants/lg-73-17-0070-17/proposals/lg-73-17-0070-17-full-proposal-documents.pdf.
Schmidt, Marion. 2018. “An Analysis of the Validity of Retraction Annotation in PubMed and the Web of Science.” Journal of the Association for Information Science and Technology 69 (2): 318–28. https://doi.org/10.1002/asi.23913.
Schneider, Jodi, Nathan D. Woods, Randi Proescholdt, Yuanxi Fu, and The RISRS Team. 2021a. “Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda [Not Peer Reviewed][March Draft].” F1000Research 10 (March): 211. https://doi.org/10.7490/f1000research.1118522.1.
———. 2021b. “Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda [Not Peer Reviewed][April Draft].” F1000Research 10 (April): 329. https://doi.org/10.7490/f1000research.1118546.1.
———. 2021c. “Recommendations from the Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda Project.” MetaArXiv. https://doi.org/10.31222/osf.io/ms579.
Schneider, Jodi, Di Ye, Alison M. Hill, and Ashley S. Whitehorn. 2020. “Continued Post-Retraction Citation of a Fraudulent Clinical Trial Report, 11 Years after It Was Retracted for Falsifying Data.” Scientometrics 125 (3): 2877–2913. https://doi.org/10.1007/s11192-020-03631-1.
Senseney, Megan, Eleanor Dickson Koehl, Beth Sandore Namachchivaya, and Bertram Ludäscher. 2021. “Transforming Library Services for Computational Research with Text Data: Environmental Scan, Stakeholder Perspectives, and Recommendations for Libraries: A Report from the IMLS National Forum on Data Mining Research Using In-Copyright and Limited-Access Text Datasets.” Chicago: Association of College and Research Libraries. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/whitepapers/TransformingLibServices.pdf.
Shapiro, M. F., and R. P. Charrow. 1989. “The Role of Data Audits in Detecting Scientific Misconduct. Results of the FDA Program.” JAMA 261 (17): 2505–11.
Sherman, John, and Gayle Peterson. 2009. “Finding the Win in Wicked Problems: Lessons from Evaluating Public Policy Advocacy.” The Foundation Review 1 (3): 87–99. https://doi.org/10.4087/FOUNDATIONREVIEW-D-09-00036.1.
Simkin, M. V., and V. P. Roychowdhury. 2005. “Stochastic Modeling of Citation Slips.” Scientometrics 62 (3): 367–84. https://doi.org/10.1007/s11192-005-0028-2.
Suelzer, Elizabeth M., Jennifer Deal, and Karen L. Hanus. 2020. “Challenges in Discovering the Retracted Status of an Article.” Discussion paper for Reducing the Inadvertent Spread of Retracted Science: Shaping a Research and Implementation Agenda. http://hdl.handle.net/2142/108367.
Suelzer, Elizabeth M., Jennifer Deal, Karen L. Hanus, Barbara Ruggeri, Rita Sieracki, and Elizabeth Witkowski. 2019. “Assessment of Citations of the Retracted Article by Wakefield et al with Fraudulent Claims of an Association between Vaccination and Autism.” JAMA Network Open 2 (11): e1915552. https://doi.org/10.1001/jamanetworkopen.2019.15552.
Suelzer, Elizabeth M., Jennifer Deal, Karen Hanus, Barbara E. Ruggeri, and Elizabeth Witkowski. 2021. “Challenges in Identifying the Retracted Status of an Article.” JAMA Network Open 4 (6): e2115648. https://doi.org/10.1001/jamanetworkopen.2021.15648.
Taylor, Simone, Susan Spilka, Kristen Monahan, Isabel Mulhern, and Jeri Wachter. 2020. “Evaluating Equity in Scholarly Publishing.” Learned Publishing 33 (4): 353–67. https://doi.org/10.1002/leap.1301.
The Editors of The Lancet. 2015. “Correcting the Scientific Literature: Retraction and Republication.” The Lancet 385 (9966): 394. https://doi.org/10.1016/S0140-6736(15)60137-4.
“Top 10 Most Highly Cited Retracted Papers.” 2015. Retraction Watch (blog). December 28, 2015. https://retractionwatch.com/the-retraction-watch-leaderboard/top-10-most-highly-cited-retracted-papers/.
Topi, Heikki, Allen Tucker, and Allen Tucker, eds. 2014. “Sociotechnical Approaches to the Study of Information Systems.” In Computing Handbook : Information Systems and Information Technology. Chapman and Hall/CRC. https://doi.org/10.1201/b16768.
TRANSFORM Project. n.d. “Participatory Research Agenda Setting.” TRANSFORM (blog). Accessed November 9, 2021. https://www.transform-project.eu/citizen-engagement/participatory-research-agenda-setting/.
Vermaas, Pieter E., and Udo Pesch. 2020. “Revisiting Rittel and Webber’s Dilemmas: Designerly Thinking Against the Background of New Societal Distrust.” She Ji: The Journal of Design, Economics, and Innovation 6 (4): 530–45. https://doi.org/10.1016/j.sheji.2020.11.001.
Vet, Paul E. van der, and Harm Nijveen. 2016. “Propagation of Errors in Citation Networks: A Study Involving the Entire Citation Network of a Widely Cited Paper Published in, and Later Retracted from, the Journal Nature.” Research Integrity and Peer Review 1 (December): 3. https://doi.org/10.1186/s41073-016-0008-5.
Vorland, Colby J., Andrew W. Brown, Keisuke Ejima, Evan Mayo‐Wilson, Danny Valdez, and David B. Allison. 2020. “Toward Fulfilling the Aspirational Goal of Science as Self-Correcting: A Call for Editorial Courage and Diligence for Error Correction.” European Journal of Clinical Investigation 50 (2): e13190. https://doi.org/10.1111/eci.13190.
Wager, Elizabeth, Sabine Kleinert, Volker Bähr, Ksenija Bazdaric, Michael Farthing, Michele Garfinkel, Chris Graf, et al. 2021. “Cooperation & Liaison between Universities & Editors (CLUE): Recommendations on Best Practice.” Research Integrity and Peer Review 6 (April): 6. https://doi.org/10.1186/s41073-021-00109-3.
Weiss, Janet A. 1989. “The Powers of Problem Definition: The Case of Government Paperwork.” Policy Sciences 22 (2): 97–121. https://doi.org/10.1007/BF00141381.
Wetterer, James K. 2006. “Quotation Error, Citation Copying, and Ant Extinctions in Madeira.” Scientometrics 67 (3): 351–72. https://doi.org/10.1556/Scient.67.2006.3.2.
Wiedermann, Christian J. 2018. “Inaction over Retractions of Identified Fraudulent Publications: Ongoing Weakness in the System of Scientific Self-Correction.” Accountability in Research 25 (4): 239–53. https://doi.org/10.1080/08989621.2018.1450143.
Williams, Peter, and Elizabeth Wager. 2013. “Exploring Why and How Journal Editors Retract Articles: Findings from a Qualitative Study.” Science and Engineering Ethics 19 (1): 1–11. https://doi.org/10.1007/s11948-011-9292-0.
Wood, B. Dan, and Alesha Doan. 2003. “The Politics of Problem Definition: Applying and Testing Threshold Models.” American Journal of Political Science 47 (4): 640–53.
Woolf, Patricia K. 1987. “Ensuring Integrity in Biomedical Publication.” JAMA 258 (23): 3424–27. https://doi.org/10.1001/jama.1987.03400230084037.
Wright, Kath, and Catriona McDaid. 2011. “Reporting of Article Retractions in Bibliographic Databases and Online Journals.” Journal of the Medical Library Association 99 (2): 164–67. https://doi.org/10.3163/1536-5050.99.2.010.
A negative citation “disputes, corrects or questions, or negatively evaluates cited work” (Suelzer et al. 2019).