Fishy Science
Implications of the retraction of a recent paper for scientific integrity in the White House
Recently, the Proceedings of the National Academy of Sciences (PNAS) retracted a highly influential paper on marine protected areas and fishing, after the identification of significant errors that undercut the paper’s results, as well as serious irregularities in the peer review process. What makes this particular retraction of unusual interest is that the irregularities in the PNAS peer review process involve Dr. Jane Lubchenco, the White House official who currently oversees President Biden’s Scientific Integrity Task Force.
The paper, A global network of marine protected areas for food (Cabral et al. 2020, hereafter C20), was published by PNAS in October 2020, and has been widely reported on due to its perceived policy relevance. Dr. Lubchenco served as its editor for PNAS. That means she was responsible for overseeing the paper’s journey through the peer review process, including the selection of reviewers. We now know that Dr. Lubchenco violated PNAS guidelines for conflict of interest, and not unknowingly or in a small way.
The details matter here, so I am going to explain them.
The issues in peer review of C20 at PNAS are not subtle. The authors of C20 included seven researchers with whom Dr. Lubchenco was collaborating on a different paper — Sala et al. 2021 (published in Nature, hereafter S21) — that built upon the results of C20. Even though C20 was published first (26 Oct 2020 vs 17 Mar 2021), S21 was actually submitted three weeks prior to C20 (17 Dec 2019 vs. 6 Jan 2020). So at the time that Dr. Lubchenco assumed the role of editor for C20, she had just submitted a different paper, with seven of C20’s authors, that built upon the C20 results — S21 thus depended upon the successful publication of C20.
Already, this is an egregious violation of scientific integrity. It gets worse. One of Dr. Lubchenco’s co-authors on S21 who was also a co-author of C20 — the paper she was editing for PNAS — was her brother-in-law. These various conflicts were called to the attention of PNAS in April 2021 by Dr. Magnus Johnson, of the University of Hull in the UK, prompting an investigation.
The PNAS guidelines are completely clear (emphasis added):
A competing interest due to a personal association arises if you are asked to serve as editor or reviewer of a manuscript whose authors include a person with whom you had an association, such as a thesis advisor (or advisee), postdoctoral mentor (or mentee), or coauthor of a paper, within the last 48 months. When such a competing interest arises, you may not serve as editor or reviewer.
A competing interest due to personal association also arises if you are asked to serve as editor or reviewer of a manuscript whose authors include a person with whom you have a family relationship, such as a spouse, domestic partner, or parent–child relationship. When such a competing interest arises, you may not serve as editor or reviewer.
While Dr. Lubchenco should not have edited C20, to be completely fair to her, the “competing interest due to a personal association” guideline does not appear to be much enforced by PNAS. I was able to quickly identify multiple violations of this guideline via a simple search — here are one, two, three examples — the third of which also involves Dr. Lubchenco as editor. If PNAS retracts papers that violate its competing-interest guidelines, it will no doubt find a rather large set of papers.
There is more to the story.
On November 17, 2020, Dr. Lubchenco testified before the House Natural Resources Committee, citing C20 in support of Congressional legislation to establish protected areas in marine ecosystems. She did not disclose that she had shepherded the paper through peer review, nor did she disclose that she was a collaborator in the research. The unavoidable impression that this sequence of events gives is the creation of “policy-based evidence” — that is, evidence that is created for the explicit purpose of supporting particular policy or political outcomes, like the passing of legislation.
The impression of “policy-based evidence” is further supported by the fact that the failures of the peer review process in this instance are not just procedural, they are substantive as well. It turns out that the science of C20 is also fatally flawed. In an excellent and comprehensive post, Max Mossler goes into detail on the errors of C20, how they were identified, and how they also call into question the validity of S21 — which has likewise received an incredible amount of media and policy attention. Here I’ll just quote his bottom line:
Regardless of any conflict of interest, the science in both Cabral et al. and Sala et al. is critically flawed, but being used to advocate for public policy. Both follow a recent trend of publishing predictions that use a limited set of assumptions (in a very uncertain world) to produce global maps that get published in high-profile journals and garner considerable media and political attention.
Computer models are essential tools for science and management, but the accuracy of their predictions depends on both the quality of the data and the assumptions they are based on. Often, a problem is so complex that several assumptions may be equally plausible; readers need to be made aware when different assumptions lead to vastly different outcomes.
The Cabral et al. and Sala et al. papers disregard uncertainty in favor of set values for their model parameters. They don’t account for the enormous uncertainty in these parameters and don’t provide strong evidence that their choice of values was correct. The assumptions and parameters produce big headlines, but are fundamentally unhelpful for the future of ocean governance and sustainability. We expect policy-makers and resource managers to make decisions based on the best available science. Inconsistent and unrealistic assumptions are not that.
And if all of that is not bad enough, it still gets worse. S21 reports (inaccurately) that its projections are based on the IPCC SRES A2 scenario — which, for anyone who knows anything about climate scenarios, would have been an incredibly odd choice, not least because that scenario is more than 20 years old and rarely (if ever) used in research today. It turns out (if you dig deep enough) that C20 and S21 are in fact not based on the IPCC SRES A2 scenario, but instead on the implausible RCP8.5. That the authors don’t know the difference between A2 and RCP8.5 is itself problematic. That RCP8.5 is being used to generate predictions for use in marine/fisheries policy is even more problematic.
So we have quite a mess here. Going forward, here are some recommendations:
Nature should immediately evaluate S21 for retraction, as it is based on C20, which is now retracted. It is difficult to see how S21 can stand unretracted.
PNAS properly retracted C20, but the journal should also do a comprehensive audit to assess the extent of other violations of its conflict of interest guidelines. A cursory look suggests that such violations are not uncommon.
Given Dr. Lubchenco’s significant violations of PNAS policies to publish flawed research, and then using that flawed research to advocate for policy, the White House should reconsider her leadership role in its Scientific Integrity Task Force. Otherwise, it would be fair to ask if scientific integrity guidelines are optional, depending on your politics.
This episode provides good news and bad news. The good news is that it underscores that science is indeed self-correcting, even if that process takes a while. In the long run, better science defeats bad science. The bad news is that in the short term, leadership and institutions failed. This episode of fishy science is not over yet — PNAS, Nature and the White House still have important roles to play in ensuring scientific integrity. Watch this space.
What a shame that dishonesty and science hacks have failed their mission, and have likely benefited from more than just the publications. Honesty, though essential to a successful society, has been lost! In the name of honesty and science, purge the dishonest from science!
Jane was properly penalized for her human mistake, i.e., being so overly enthusiastic about MPAs that she cut some ethical corners. But Stev, I'd be careful about throwing stones, as most people live in glass houses. Fortunately, science is more self-correcting than other disciplines:
Vadas, R.L. Jr. 1994. The anatomy of an ecological controversy: honey bee searching behavior. Oikos 69: 158-166 (https://www.beesource.com/threads/the-anatomy-of-an-ecological-controversy-honey-bee-searching-behavior.365460).
And MPAs are valid management tools, though we need to stay honest and transparent about such research.
-Bob V.