John Tukey said that the best thing about being a statistician is that you get to play in everyone’s backyard. It’s a great quote. While it more obviously reflects the tendency of applied statisticians (defined broadly) to be involved in lots of studies, it also captures something about statisticians that is often missed - that we are scientists too, delighting in the pursuit of new knowledge. I certainly see myself in this quote, seeing lots of science from my perches as panelist, reviewer, collaborator and co-investigator, and frequently finding myself experiencing a child’s wonderment over some cleverness or innovation on display.
Backyards are not all the same though. There are yards with tree-houses and tire swings. Swimming pools and putting greens. Beautiful gardens with deep, wicker basket chairs. But others are just grass and a shrub. Overgrown patches of dust. Standing pools of oily water. Broken glass and rusty nails. Cinder blocks and cigarette butts. As statisticians we sometimes get to "play" in those yards too.
So what makes for a bad or broken study? Poor study designs, clumsy measurement, flawed analyses of the resulting data, and haphazard data management - all key drivers of research waste. These are the problems that will obviously catch the eye of a good statistician, and we can often help you prevent them if given the opportunity. Assuming you can find one of us to work with, that is.
But there are other bad backyards statisticians can also help you avoid, such as over-hyped research. After all, statistics is about uncertainty, while bullshit is shoveled with certitude. That’s why we say, "Don't tell me, show me. And then show me a few more times at least." And because we tend to work across so many studies, we see firsthand that most interventions we test in rigorous studies don't seem to work. We see that most targets for drug discovery lead nowhere. The vast majority of clinical prediction models aren’t good enough to actually improve decision making. And so on. Thus it's in our nature to raise an eyebrow at any strong claims, because we understand and accept that failure is the norm, exactly as it should be on the cutting edge of scientific discovery.
But try telling that to a university press office. Tell that to a scientist talking into a journalist's microphone about their most recent publication. Tell that to the funder who spent the public's money on all these "failures". Everything is world class. Everything will greatly benefit the public and our patients. Ain’t that a statistical oddity? Every study, every scientist, and every university is in the top 1%. This hype puts at risk the precious trust the public place in us to be faithful conduits of our empirical findings. It distracts us from the importance of quality in method and process. And it can grossly distort what research gets funded in the first place, as we disproportionately pour money into decades-long hunts for evasive silver bullets by "superstar" scientists. So a tiny bit of skepticism is probably in order - and perhaps a statistician can help you find it. *Stares hard at journalists, politicians and research funders*
And finally, what happens when you see and play in a bunch of backyards? You get to see the neighborhood (as he wrung the very last drop of blood from this analogy). And with this bird's-eye view you see that, just like in real life, the deficiencies of a yard aren't nearly as much about the yard's owner as they are about structural deficits in the environment. For example, universities in Ireland are chronically underfunded. This forces us to compete with each other for external funding (and their external priorities), while spending money chasing deeply flawed university rankings in order to attract international students and the grossly inflated fees we charge them. What are we doing!? Meanwhile, actual research support services are often lacking. Researchers don’t have time to focus on research, especially those with clinical duties. Grant applications are done at the last minute. Calendars are on fire. And PhD students, post-doctoral researchers, and research assistants on low salaries and precarious contracts are expected to do much of the actual day-to-day sciencing, often causing expensive research projects to stumble at the end when those people have to quickly jump onto the next opportunity. So of course research waste and over-hyped studies are commonplace. How could we expect anything else?
And where do statisticians fit in here? After all, it was the statistician Doug Altman who diagnosed a key root of these problems 30 years ago in the BMJ, laying much of the blame on incentives for researchers and institutions to maximize the quantity of publications at the expense of quality. He even gave us the cure: "We need less research, better research, and research done for the right reasons" - not that we listened.
Perhaps we should start?
Love, love, love. But how does this information reach the eyes and ears of health professionals who bring junk research to the coalface of practice? If only.
Excellent post.
"try telling that to a university press office. Tell that to a scientist talking into a journalist's microphone about their most recent publication".
That hits the nail on the head.