How about a systematic review that writes itself?

Guy Tsafnat, Paul Glasziou, Enrico Coiera and I have written an editorial for the BMJ on the automation of systematic reviews. I helped a bit, but the clever analogy with the ticking machines from Player Piano fell out of Guy’s brain.

In the editorial, we covered the state of the art in automating specific tasks in the process of synthesising clinical evidence. The basic problem with systematic reviews is that we waste a lot of time and effort redoing them when new evidence becomes available – and in many cases, systematic reviews are out of date nearly as soon as they are published.

The solution – drawing on Kurt Vonnegut’s Player Piano, a dystopian science fiction novel in which ticking automata replicate the actions of a human after observing them – is to replace standalone systematic reviews with dynamically and automatically updated reviews that change whenever new evidence becomes available.

At the press of a button.

The proposal is that after developing the rigorous protocol for a systematic review (something that is already done), the technology should be in place for clinicians to simply find the review they want, press a button, and have the most recent evidence synthesised in silico. The existing protocols determine which studies are included and how they are analysed. The aim is to dramatically improve the efficiency of systematic reviews and improve their clinical utility by providing the best evidence to clinicians whenever they need it.

Tsafnat G, Dunn AG, Glasziou P, Coiera E (2013). The automation of systematic reviews. BMJ 346:f139

Dealing with industry’s influence on clinical evidence

I co-wrote a piece for The Conversation about a new article published in the Cochrane Database of Systematic Reviews, written by Andreas Lundh and other luminaries from the research area. The authors showed that industry-sponsored clinical trials more often report positive outcomes and fewer harmful side effects.

The most interesting result from the article was that the biases that make industry-funded clinical trials more likely to produce positive results could not be accounted for using the standard tools that measure bias. This is amazing because it gives us a strong hint that industry funding is an independent source of heterogeneity in the systematic reviews that include such trials.

Too bad it’s the 12th of the 12th 2012 and the world is about to end. We won’t have time to sort it out.

(Feature image: AAP Image/Joe Castro, via The Conversation)