Evidence Summary and Synthesis: A Summary
I hope I’m right in saying that, when asked to ‘do a search’ for someone, our default is usually to ask the enquirer a few questions (if they haven’t already run away to get on with something else), jot down some key terms, quickly switch HDAS on and get on with formulating a strategy so that HDAS can generate a lovely list of references we can then email to whoever asked. We hope they appreciate it, but they have since gone about their day and we’re left wondering if anyone ever opened the email – then the next search comes in and we start the process all over again…
I came into the Searching and Synthesis course from a slightly different perspective. The types of requests we receive from colleagues in HEE are less suited to this way of searching, and the answers rarely come from HDAS. Quite often I start with Google to look for titbits in reports and board papers that, though lower down the ‘evidence hierarchy’, might just be what our enquirer is looking for. Going in, I was anxious: I wasn’t sure I could do ‘proper searches’ using masterfully honed strategies that reaped endless systematic reviews and RCTs. And, after I’d struggled through the search, would I be able to ‘take a breath’ as John suggested, and actually look at the results and paint some form of coherent picture of what they found?
The first session helped me think about how I organise the results I find as I’m searching. I think most of those who attended found the ‘table tool’ useful. It seems simple, but organising results into a matrix detailing citation information and notes from the abstract really did help me organise my thoughts. You can download results from HDAS into Word or Excel, but they are not tidily arranged, so if you’re doing this, factor in time for cleaning up the results. This of course takes longer if the results cannot be downloaded from HDAS because they have been plucked from all over the web – something I raised in the last follow-up session. However, on the whole I found this a really helpful way to organise my results and prepare them for the second phase – summary/synthesis.
The second session felt like much more familiar ground: we looked at other useful sources where commissioners/managers might expect the answers to be (the Health Foundation, the King’s Fund or the HSJ for news updates).
However, it was the follow-up sessions (three half mornings, plus a mailing list) where I think I gained the most practical experience and where the most ideas were shared. Each session had an accompanying practical exercise that we could all have a go at, then talk about at the session. The exercises were stepped:
- in the first we were given a set of results and asked to ‘write a review’
- in the second we were again given a set of results to review but, to make it a little harder, the results had contradictory findings so we had to be more inventive in our write-up
- and finally we were given a query and had to do the search and write the review.
I gained a lot from this opportunity: because we all conducted the same search and compared reviews in the follow-up sessions, there was an element of peer review that is normally hard to achieve. Seeing how others had gone about theming and presenting results was for me the most helpful part of the course.
I did think we missed an opportunity to do what I’m coining a “HDASless search” and review, though. If the evidence that managers and commissioners need to make decisions is not always found in the databases, it would’ve been nice to have a go at a review where the results couldn’t really be organised by level of evidence, or that were from disparate sources with no abstracts.[i] These results inevitably take longer to summarise, and a bit of digging might be required to really pick out the useful bits. I think going forward this is something that we should revisit if we’re going to have an impact from ‘bedside to boardroom’.
The general consensus from the groups I attended was:
- This isn’t appropriate for all searches or for everyone who has a query (a bit of judgment on our part is needed to select the occasions where this could be impactful and add value)
- Putting the reviews together is incredibly time consuming (the last search and review took at least 8 hours; others in the group reported longer). I’m sure we’d get a bit quicker with practice, but this reinforces the first point about it not being realistic for every search
- This process definitely hones our skills – it encourages us to actually get to grips with the material and put our synthesis hats on to create a useful, brief review
- The results were aesthetically pleasing – people used tables, headings, colour(!) – and everyone agreed it made for a much nicer final result that could be packaged and branded as an LKS ‘product’
We also had some discussions about who the audience for reviews would be and decided that this would depend entirely on the organisation. We would each know best when, and for whom, these reviews could work. In the last activity it was easy to forget who the audience was and what they were actually asking. I struggled to get to grips with the final search question, and lots of us reinforced the importance of the ‘reference interview’ in a real search scenario. After all, this is not Line of Duty – we need not be ‘one rank senior’ to probe a little further and uncover what our colleague really wants to know, rather than what they originally asked for when they collared us in the corridor.
Some other questions the feedback sessions raised:
- How important are levels of evidence? They are obviously important, but how important are they in this scenario? It’s possible we’re dismissing, or not paying enough attention to, potentially useful results because they’re at the wrong end of the trusty evidence pyramid
- Where does critical appraisal fit into all this? The reviews take so long that it may be necessary to push the critical appraisal ball back into the courts of health professionals, or at least make it clear we have not appraised the material
- Just what is the difference between synthesis and summary? I’m not sure we really had an answer for this. I’m proposing 42.
- Is there any merit to shared searching and reviewing? There could be scope in tackling searches together to get a higher quality, almost peer-reviewed, final product
- How do we embed this into practice? Is it realistic to embed this into practice? I think the answer will depend on each team: their capacity to deliver this to their members and their belief that it will add value to their service offer
- What are the next steps? There may be a way we can all regroup and have another practice, to maintain the skills and keep the discussion going through the already existing Clinical Librarians’ group
- How do we better distribute these summaries? We need to think about duplication of work and remember the ‘do once and share’ motto if we are going to invest so much resource into this
- How do we overcome our insecurities as non-clinical folk? After an early crisis of confidence, I came to the conclusion that I could only review the information in front of me, in the words it was written. Transparency is key – we can only be clear about what we’ve done, or not done (i.e. critical appraisal); ultimately it is for the health professional to unpick what the evidence means in practice
- How do we tackle those HDASless searches and reviews? It’s hard for me to ignore the wealth of resources not housed in the databases that could enrich these reviews, even if they do muddy the waters and make them trickier to do
For me the course was a good start to what I’m sure will be an ongoing conversation about this, and I look forward to hearing from the other synthesisers (thanks John). The mailing list (email@example.com) is a good place to start. There have already been some interesting discussions about RAG ratings, disclaimers, critical appraisal and how best to tackle these reviews, so why not join the conversation and share some of your thoughts and tips?
Health Education England, working across the North West
 Or is it Synthesis?
 John Gale’s blog about the course https://lihnnclinicallibs.wordpress.com/2017/05/12/evidence-synthesis-going-beyond-the-reference-list-by-john-gale/
[i] Anne Gray shared an interesting article about commissioners and the evidence they’re looking for: https://www.nihr.ac.uk/blogs/evidence-based-policy-making-the-view-from-a-commissioner/6045