Evidence Synthesis Methods Interest Group

  • 1.  Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Jan 31, 2025 11:46 AM

    Hi All,

    I suddenly have the opportunity to have work pay for personal access to several screening software programs for testing purposes, with an eye toward a potential future institutional subscription. I'm looking for advice on which tools I should test out (Covidence, EPPI Reviewer, LaserAI, DistillerSR, etc.). I had planned to review all of these carefully over the next few months to select which ones I even wanted to test (and to design my test protocols), but now I suddenly have to make this decision very quickly. So I am seeking some quick assistance!

    For context, my university has very little health sciences. The folks who want this software and want to do evidence synthesis are in social work, environmental evidence, agriculture, animal studies (we do have veterinary medicine), education, computer science, psychology, etc. If you have any unique insight into why one tool or another is better for these non-health audiences, I'm genuinely all ears.

    Thanks in advance!

    Regards,

    Anna



    ------------------------------
    Anna Ferri, MLIS, MEd
    pronouns: she/her(s)
    Assistant Professor | Evidence Synthesis Librarian
    Colorado State University Libraries
    P: 970-491-1146 | anna.ferri@colostate.edu
    1201 Center Avenue Mall | Fort Collins, CO 80523
    ------------------------------


  • 2.  RE: Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Feb 03, 2025 11:05 AM

    Hi Anna,

    Many of the projects I've been helping with recently have been related to education (special education, specifically) and agriculture. I personally find that Rayyan.ai works well for those teams. I also like Covidence, but Rayyan is much more cost-effective for my current faculty and student population. I pay for the upgraded version, which provides more features than the basic version.

    However, with all that said, I also like EPPI-Reviewer, especially for education. The level of support I provide really changes which software I use or suggest. But as far as what I find the most user-friendly, I like Rayyan and Covidence. In either of those (the paid version of Rayyan), you can also deduplicate your citation files directly rather than uploading them into Zotero or EndNote first, which I like.

    I think each one you listed above has its pros and cons.

    I hope this helps a little. Good luck!



    ------------------------------
    Megan York
    Education Librarian
    University of Arkansas
    She/Her/Hers
    ------------------------------



  • 3.  RE: Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Feb 03, 2025 11:09 AM

    Hi Anna!

    One you might consider is Sysrev. This is something we've been using for most of our review work at CMU for the past few years. It is relatively under the radar because it is a very small start-up, and I don't think they've invested much in getting into the academic market.

    We like it for a few main reasons:
    1. It is very flexible, so you can create highly customizable review workflows and not be locked into the standard 'systematic review' approach the way you would be with Covidence and others. This works great for us, since many of our users are also not in the health sciences and are open to using different approaches.
    2. It is built on FAIR principles, so the data generated during the review process are readily accessible for computational work outside of the platform (see the sketch after this list). As a comparison, as much as we also like Covidence, we have found it very limited in terms of what data you can get out of it (for example, you lose data about record-level conflicts as soon as the conflicts are resolved). As we move toward more LLM-based screening approaches, this becomes more of a problem.
    3. It now has a built-in LLM feature that we have found quite effective. Again, since many of our people are doing lit review or ES projects that aren't standard systematic reviews, folks want to experiment with LLMs in their workflow, and Sysrev makes that possible without requiring any programming skills.
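
    To make point 2 concrete, this is the kind of thing I mean by computational work on the exported data. The sketch below is purely illustrative (the file name and column names are placeholders, not Sysrev's actual export schema); it just tallies record-level screening conflicts from a CSV of per-reviewer decisions, i.e. the data Covidence drops once conflicts are resolved.

    ```python
    # Hypothetical illustration only: the file name and column names are
    # placeholders, not Sysrev's actual export schema. Assumes a CSV with one
    # row per (record, reviewer) screening decision.
    import csv
    from collections import defaultdict

    decisions = defaultdict(set)  # record_id -> set of distinct decisions seen

    with open("screening_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            decisions[row["record_id"]].add(row["decision"].strip().lower())

    conflicts = [rid for rid, votes in decisions.items() if len(votes) > 1]
    print(f"{len(conflicts)} of {len(decisions)} records had conflicting decisions")
    ```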
    A couple of significant downsides: there is very little up-to-date documentation, and there are still occasional bugs and quirks, though they tend to address these fairly quickly. We ended up building our own documentation (almost complete and available here) for our users.

    Also, there are a couple of features that simply no longer work but are still visible; they are working on reinstating these or removing them. Most importantly, their machine learning model (a.k.a. the prioritized screening feature) currently doesn't work. They have prioritized getting it back online in the next few weeks.

    Full disclosure, we have been working closely with this company to test the product in an academic enterprise context, so I might be a bit biased! 

    Happy to answer other questions about it if you have them. I would love to know your opinion if you do try it out!

    Sarah





  • 4.  RE: Recommended Screening Tools (especially for non-health science evidence synthesis)

    Posted Feb 03, 2025 12:44 PM

    Hi Anna - at the U. of Minnesota we predominantly use Covidence and really like it, and we've used it for all the disciplines you mention. The one drawback other than cost, IMO, is the inability to randomly assign records to pairs of reviewers, or to generate a random sample of records for training/piloting. It lacks some of the features other programs have, but that simplicity may be a bonus, IMO, when training teams to use it. EPPI Reviewer does those things, and more, but it's so powerful that it's difficult to learn/navigate, IMO.

    Rayyan has improved recently, but that said, I still hate it, at least the free version. I'm currently using it as a screener and keep hitting paywalls for features that would make it more useful. I also find that it's fairly buggy - for example, I might filter to records from a certain journal, but after n records (where n is inconsistent) I suddenly stop seeing records until I go back and refilter. And if I want to check multiple choices within a filter, I have to re-scroll to that filter, check a box, get booted to the top, scroll down, expand, check the next one, and rinse-and-repeat. (Your experience may vary, as it works for Megan, but it may be the difference between free and paid.)
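
    (Going back to the random-assignment/pilot-sample gap in Covidence: one low-tech way to fill it is a quick script over an exported record list. The sketch below is purely illustrative - the file name, column name, and reviewer names are placeholders, not anything Covidence-specific.)

    ```python
    # Hypothetical illustration only: assumes you've exported your deduplicated
    # records to a CSV with a "record_id" column; file and column names are
    # placeholders, not any tool's actual export format.
    import csv
    import random

    with open("records_export.csv", newline="", encoding="utf-8") as f:
        records = [row["record_id"] for row in csv.DictReader(f)]

    random.seed(2025)  # fixed seed so the draw is reproducible and documentable
    pilot = set(random.sample(records, min(50, len(records))))  # pilot/training set

    # Randomly assign each remaining record to a pair of reviewers.
    reviewers = ["reviewer_a", "reviewer_b", "reviewer_c", "reviewer_d"]
    assignments = {
        rid: tuple(random.sample(reviewers, 2))
        for rid in records
        if rid not in pilot
    }
    print(f"{len(pilot)} pilot records, {len(assignments)} records assigned to pairs")
    ```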

    Best,

    Scott



    ------------------------------
    Scott Marsalis
    Social Sciences Librarian
    University of Minnesota
    He/Him/His
    ------------------------------