Evidence Synthesis Methods Interest Group


Charge: To promote and develop competencies around evidence synthesis including systematic reviews, meta-analyses, scoping reviews, and other related methods of research synthesis, through activities such as: Facilitating discussion and peer-support; Creating and managing a resource page; Encouraging programming and publications around systematic reviews through ACRL.
Community members can post as a new Discussion or email ALA-acrlesmig@ConnectedCommunity.org

Collected Responses: Recommended Screening Tools (especially for non-health science evidence synthesis)

  • 1.  Collected Responses: Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Feb 07, 2025 11:40 AM

    Hi All,

    I am deeply grateful for all the responses to my query! They helped immensely. You are all just amazing!

    Below are the collected responses I received on this listserv and in other spaces to the following request for information:

    "I suddenly have the opportunity for work to pay to get me personal access to several screening software programs for testing purposes towards a potential future institutional subscription. I'm looking for advice on which tools I should test out (Covidence, EPPI Reviewer, LaserAI, DistillerSR, etc). I was going to review all these nice and carefully over the next few months to select which ones I even wanted to test (plus design my test protocols), but now I suddenly have to make this decision very quickly. So I am seeking some quick assistance! For context, my university has very little health sciences. The folks who want this software and want to do evidence synthesis are in social work, environmental evidence, agriculture, animal studies (we do have veterinary medicine), education, computer science, psychology, etc. If you have any unique insight into why some tool or other is better for these non-health audiences, I'm genuinely all ears."

    I have included identifying information only when the response I received was made broadly to this listserv. If the response was made to me directly or in another space, then I have anonymized it.

    Megan York, Education Librarian, University of Arkansas

    Many of the projects I've been helping on recently have been related to education (special education, specifically) and agriculture. I personally find that Rayyan.ai works well for those who need it. I also like Covidence, but Rayyan is much more cost-effective for my current faculty and student population. I pay for the upgraded version, which provides more features than the basic version. 

    However, with all that said, I also like EPPI-Reviewer, especially for education. The level of support I provide really changes which software I use or suggest. But as far as what I find most user-friendly, I like Rayyan and Covidence. You can also deduplicate your citation files within either of those tools (in the paid version of Rayyan) instead of having to run them through Zotero or EndNote first, which I like.

    I think each one you listed above has its pros and cons. 

    Sarah Young, Carnegie Mellon University

    One you might consider is Sysrev. It's what we've been using for most of our review work at CMU for the past few years. It's relatively under the radar: it's a very small start-up, and I don't think they've invested much in reaching the academic market.

    We like it for a few main reasons:

    1. It is very flexible: you can build highly customizable review workflows rather than being locked into the standard 'systematic review' approach the way you are with Covidence and others. This works great for us since many of our users are not in the health sciences and are open to using different approaches.
    2. It is built on FAIR principles, so the data generated during the review process are readily accessible for computational work outside the platform (a rough export-parsing sketch follows this list). As a comparison, as much as we also like Covidence, we have found it very limited in terms of what data you can get out of it; for example, you lose record-level conflict data as soon as conflicts are resolved. As we move toward more LLM-based screening approaches, this becomes more of a problem.
    3. It now has a built-in LLM feature that we have found quite effective. Again, since many of our people are doing lit review or ES projects that aren't standard systematic reviews, folks want to experiment with LLMs in their workflow, and Sysrev makes that possible without needing programming skills.
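
    To make the point about getting your data out concrete, here is a rough, platform-agnostic sketch in Python. The file name and the 'record_id'/'reviewer'/'decision' column names are placeholders, not any tool's actual export format; the idea is simply that if you keep a per-reviewer decision export, record-level conflicts can be recomputed at any time rather than disappearing once they are resolved inside a closed platform.

        import csv
        from collections import defaultdict

        def find_conflicts(path):
            # record_id -> set of distinct decisions made across reviewers
            decisions = defaultdict(set)
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    decisions[row["record_id"]].add(row["decision"].strip().lower())
            # a conflict is any record on which the reviewers did not all agree
            return [rid for rid, seen in decisions.items() if len(seen) > 1]

        if __name__ == "__main__":
            conflicts = find_conflicts("screening_decisions.csv")  # placeholder file name
            print(f"{len(conflicts)} records with reviewer disagreement")

    The same kind of export is also what makes it practical to hand records to an LLM-based screening step outside the platform.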

    A couple of significant downsides: there is very little up-to-date documentation, and there are still occasional bugs and quirks, though they tend to be fairly quick to address them. We ended up building our own documentation (almost complete and available here) for our users. 

    Also, there are a couple of features that simply no longer work but are still visible; they are working on reinstating these or removing them. Most importantly, their machine learning model (aka the prioritized screening feature) currently doesn't work. They have prioritized getting this back online in the next few weeks. 

    Full disclosure, we have been working closely with this company to test the product in an academic enterprise context, so I might be a bit biased! 

    Happy to answer other questions about it if you have them. I would love to know your opinion if you do try it out!

    Scott Marsalis, Social Sciences Librarian, University of Minnesota

    At the U. of Minnesota we predominantly use Covidence and really like it, and we've used it for all the disciplines you mention. The one drawback other than cost, IMO, is that it can't randomly assign records to pairs of reviewers or generate a random sample of records for training/piloting. It lacks some of the features other programs have, but that simplicity may be a bonus when training teams to use it. EPPI Reviewer does those things, and more, but it's so powerful that it's difficult to learn and navigate, IMO.

    Rayyan has improved recently, but that said, I still hate it, at least the free version. I'm currently using it as a screener and keep hitting paywalls for features that would make it more useful. I also find it fairly buggy. For example, I might filter to records from a certain journal, but after some number of records (and that number is inconsistent) I suddenly stop seeing records until I go back and refilter. And to check multiple choices within a filter, I have to re-scroll to that filter, check a box, get booted to the top, scroll down, expand, check the next one, and repeat. (Your experience may vary, as it works for Megan, but it may be the difference between the free and paid versions.)
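
    One crude workaround for the missing random-sample feature is to draw the pilot set yourself before importing into Covidence. Here is a minimal sketch in Python, assuming the search results have been exported as a CSV with a 'title' column; the file name and column name are placeholders, not Covidence's own format.

        import csv
        import random

        def pilot_sample(path, n=50, seed=2025):
            # load the exported search results
            with open(path, newline="", encoding="utf-8") as f:
                records = list(csv.DictReader(f))
            random.seed(seed)  # fixed seed so the pilot set is reproducible
            return random.sample(records, min(n, len(records)))

        if __name__ == "__main__":
            for rec in pilot_sample("search_results.csv"):  # placeholder file name
                print(rec.get("title", ""))

    The sampled records could then be loaded as a small separate review for the whole team to pilot and norm on.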

    Anonymous

    I realize you're perhaps more interested in non-health sciences perspectives, and most of my evidence synthesis work is with folks in nursing.  But I will say that their projects and questions tend to be less like systematic reviews of intervention effectiveness (i.e., painstakingly extracting quantitative data from RCTs), and are often broader, like scoping reviews or integrative reviews. Based on those experiences, I can see Covidence being useful for folks outside the health sciences. 

    Strengths:

    • I think the interface is pretty intuitive and helps novices get their heads around the process of screening for evidence synthesis projects
    • Bulk full text upload works reasonably well, if you have access to EndNote
    • The data extraction forms are totally customizable, so appropriate for non-health reviews
    • I like how Covidence addresses the issue of studification by allowing you to bundle multiple records into one study.
    • I've seen a few hiccups or bugs crop up now and again, and their customer service folks are usually really responsive
    • Who knows where they're going next with AI/machine learning stuff, but I think their current approach is reasonable and useful - they use machine learning for their relevancy ranking, but authors ultimately still have to lay human eyes on all the records to make a decision.

    Weaknesses:

    • The way the newest PRISMA format has you report 'other sources' is not totally aligned with the way Covidence tracks the data for the PRISMA diagram. So it's been a bit annoying to separate exclusion reasons and screening decisions for snowballed sources from the sources that came from the database searches.
    • I've had teams who want to pilot and norm their screening process with all the reviewers on the team, but Covidence only allows up to 2 reviewers for screening decisions.  So we've had to find ways to work around that for piloting.

    Covidence is the only screening tool I've really used extensively, so I can't really speak to comparisons with other tools.

    Anonymous (social media)

    My experience using screening software (mainly Covidence) has been solely with medical topics. It might be interesting to see a comparison of the software you cite from a "non-medical" source; for example, you list education among the disciplines, and "A Systematic Narrative Review of Screening Tools for Conducting Systematic Reviews in Educational Research" notes that "7 tools have been used by educational researchers, including Abstrackr, Covidence, ASReview, RevMan, Rayyan, EPPI-Reviewer, and DistillerSR" and that "we present a decision tree to assist educational systematic reviewers in identifying suitable tools." https://eric.ed.gov/?id=ED656758 [Note: I located full-text access for this conference presentation here]

    Regards,

    Anna



    ------------------------------
    Anna Ferri, MLIS, MEd
    pronouns: she/her(s)
    Assistant Professor | Evidence Synthesis Librarian
    Colorado State University Libraries
    P: 970-491-1146 | anna.ferri@colostate.edu
    1201 Center Avenue Mall | Fort Collins, CO 80523
    ------------------------------


  • 2.  RE: Collected Responses: Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Feb 13, 2025 03:05 PM

    Thanks for sharing that, Anna. I have a follow-up question for the group about Covidence based on one of the answers. We only just got an institutional license for Covidence, so I haven't had the opportunity to use it in a while. The deduplication feature used to work fairly poorly. Has it improved? Do you still recommend deduplicating in EndNote, or can Covidence be trusted to handle that now? Thanks.



    ------------------------------
    Laurel Scheinfeld
    Health Sciences Librarian
    Stony Brook University Health Sciences Library
    ------------------------------