Interview with Morwenna Rogers

By Ciara Keenan
This article was originally posted on the Meta-evidence website on 9 May 2018.

This week we are delighted to welcome Morwenna Rogers to the meta-evidence blog. Morwenna is an information specialist within the evidence synthesis team at the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) South West Peninsula (PenCLAHRC). Morwenna was awarded a master’s in information management 20 years ago and is a qualified indexer. This wealth of experience is apparent in the many publications she has authored since.

Prior to joining PenCLAHRC in 2011, Morwenna was a library manager at the Royal College of Psychiatrists and a medical information officer in the pharmaceutical industry. Morwenna has researched various search methods for systematic reviews, and is currently involved in a major research project, an Evidence and Gap Map conducted with the Campbell Collaboration on health, social care and technological interventions to improve the functional ability of older adults.

Morwenna is one of two information retrieval specialists who, together with a reviewer, develop and run the hugely popular Searching and Beyond workshops, which give librarians and other information retrieval experts the opportunity to apply their skills to searching for systematic reviews.

With the advance of technology, the way we search for studies has been revolutionised. What are the challenges associated with this?

I feel very fortunate in that I came into the library and information world at a really exciting time; searching was being made a lot easier by massive providers like Dialog and Datastar, which were then accessed remotely through a dial-up connection. This meant that I got to grips with text-word searching, using Boolean, search syntax and field codes from the get-go, since pretty much all you started with was a blank screen. Since then of course, publishers and other hosts have developed forms allowing easy access to databases via websites, which has simplified the search process enormously. However, it also means that I am frequently frustrated by poor web interfaces that have not been designed to cope with systematic review searching or complex search syntax. It is not always clear what a database is doing when you enter an instruction. It feels like such a waste when a valuable database cannot be searched in an effective way.

The next big advance is the use of machine learning and automation to identify and screen studies. This is already used effectively to identify studies of a particular study type, e.g. randomised controlled trials. However, challenges remain where a review question is less clear-cut or the study type is not defined, or in areas of social science or psychology where abstracts are written in a less structured way.
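To make this concrete, here is a minimal sketch (not the tools Morwenna describes, and with invented training data) of the general idea: a simple text classifier trained on labelled abstracts can rank unscreened records so that likely randomised controlled trials are screened first.

```python
# Minimal illustrative sketch of machine-learning-assisted screening.
# The abstracts and labels below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled abstracts (1 = randomised controlled trial, 0 = other).
train_texts = [
    "Participants were randomly allocated to intervention or control groups.",
    "A double-blind randomised trial of drug X versus placebo in adults.",
    "A qualitative study exploring carers' experiences of peer support.",
    "Retrospective cohort analysis of hospital admission records.",
]
train_labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

# Rank new, unscreened abstracts so the likeliest RCTs are screened first.
new_texts = [
    "Children were randomised to receive either therapy A or usual care.",
    "An interview study of parents of children with neurodisabilities.",
]
for text, prob in zip(new_texts, model.predict_proba(new_texts)[:, 1]):
    print(f"{prob:.2f}  {text}")
```

As the interview notes, this works best where the study type is well defined and abstracts are structured; less clear-cut questions remain hard to automate.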

What do you enjoy about Information Retrieval for Systematic Reviews?

I relish the challenge of trying to get that perfect balance between sensitivity (retrieving all the relevant studies) and specificity (keeping out the irrelevant studies). Usually in a systematic review, the more sensitive a search is, the less precise it is. As a very vague rule of thumb, I find that if 5-10% of the studies screened are included at the title and abstract stage, then that feels like a reasonable balance. As an information professional, I love grouping things together, so organising a research question into distinct concepts that can then be turned into individual search terms and phrases is the most fun part of my job. I probably spend a lot more time on this than I should! Each line of the search is individually considered and crafted. To me, it’s like playing word games.
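As a rough sketch of the "concepts into search terms" approach described here, the snippet below ORs synonyms together within each concept and ANDs the concepts, then applies the 5-10% rule of thumb to made-up screening figures. The concept names, terms and counts are invented; real strategies also use database-specific syntax, field codes and controlled vocabulary.

```python
# Illustrative only: building a Boolean search from concept groups.
concepts = {
    "population": ["parent*", "mother*", "father*", "carer*"],
    "intervention": ["peer support", "parent-to-parent", "support group*"],
}

# OR within a concept, AND between concepts.
blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
search_string = " AND ".join(blocks)
print(search_string)

# Rule-of-thumb balance check: roughly 5-10% of screened records included
# at title/abstract stage feels reasonable (these figures are made up).
screened, included = 1200, 90
print(f"Inclusion rate: {included / screened:.1%}")
```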

What is exciting to you in the field of Information Retrieval right now?

I like the idea of anything that saves time spent doing mundane tasks, such as document retrieval or screening. A notable advance in the last couple of years was when our reference management software was updated so that it could search for and retrieve full-text papers for us. The introduction of automation to take some of the load out of screening is also very welcome.

As more and more librarians and other information professionals get involved with systematic reviews, we are finding more ways to collaborate on our own research into search methods. This for me is very exciting!

Tell us what your first systematic review was about, and what do you wish you knew about searching then that you do now?

My first systematic review was about peer support for parents of children with neurodisabilities. The search was a bit of an arduous journey – it grew to about 50 lines long at one stage and then shrank back down to a handful of key terms after (no doubt) excessive testing by an enthusiastic novice. Just before publication a reviewer pointed out that one subheading wasn’t necessary as it was covered by the umbrella term above, and also that it looked like a far shorter search than was typical for a systematic review. I was mortified. By this time it was a good year since the searches had been written, and I panicked briefly that I’d submitted the wrong one. Thankfully I hadn’t. It turned out that the search didn’t need to be hundreds of lines long; it’s just that there are not many terms for a parent, or for parent-to-parent support. I stuck to my guns, the reviewer was happy and so far (fingers crossed!) we have not found any studies that were missed.

I wish I’d known then to make a note of any decisions I made about my search strategy that may look strange to a reviewer, and also not to worry too much if the search doesn’t look like a typical search strategy, as long as it is functional and reproducible. I’ve also learned that searching databases is only part of the overall strategy. At the time I was convinced that the database search must find everything, but I have relaxed a bit since (although it is still galling to miss a study!). Citation chasing, hand-searching and the like are crucial safety nets, and I think they should be considered as important as the database searches, rather than as ‘supplementary’ search methods.

What has been your experience with review authors? Are you provided with enough information to construct the search strategy initially?

I am incredibly lucky to be part of the Evidence Synthesis Team within PenCLAHRC at Exeter Medical School. Information specialists are involved throughout the project and we are always co-authors. This means that we help write the methods section of a grant application (so we are often co-applicants too) and carry out extensive scoping searches at this stage and during the drafting of the review protocol. Our team generally includes an expert in the field, so I have plenty of input from the other authors about possible terms that I am not familiar with, for interventions etc. By the time that the actual review searches are designed and run, I am very familiar with the protocol and the criteria for inclusion and exclusion. Information specialists should be included all the way through a review – it’s how you get the best out of us.
