Hi all,
Good morning! Thank you for forwarding, Amy! Very interesting read.
Finally, I have a good elevator pitch about what I do: "a systematic review, considered the gold standard method to synthesize multiple studies on a topic and extract a broader conclusion."
I also think this headline (and the overall sentiment throughout the piece) gets it 100% backwards and is, like most headlines, extremely exaggerated. Research integrity, falsification, and fabrication are all huge challenges. But this is nothing new. And rather than being an existential threat to systematic reviews, isn't it a reason we need systematic reviews more than ever?

While the author's description of systematic reviews quoted above is nice and concise, it misses a huge part of evidence synthesis: quality assessment. They even mention, for example, the INSPECT-SR and REAPPRAISED checklists. We already have people working on this, developing methods for separating the wheat from the chaff! But somehow the author fumbles it in the conclusion and the headline because it sounds hard.
Yes, research integrity is an important issue and fake papers are a huge problem.
Yes, it is complicated and it is hard to sort out good papers (and AI will make it harder to tell the difference).
Yes, we need better tools in evidence synthesis to deal with these challenges.
No, this does not mean systematic reviews are in peril. The methodology can evolve in response to changes and challenges just like it always has!
Does anyone have experience with the two tools I mentioned earlier, INSPECT-SR (Cochrane) and REAPPRAISED, or with other... what to call them... research integrity appraisal tools? I don't know much about this topic, but apparently I should learn more.
Best,
Eric
Eric Toole (he/him) | Evidence Synthesis Librarian
Science & Engineering Library
University of Massachusetts, Amherst
(413) 545-6151