Ever hear that cinnamon is good for your health? Maybe you Googled it, turning up thousands of sources proclaiming the spice’s powers in lowering blood sugar and blood pressure and helping with a host of other conditions. The word “miraculous” would have shown up more than once.
After a thorough analysis of the existing research, a team of Connecticut pharmacists and medical professionals found something else: “Cinnamon is delicious but doesn’t have the heart protecting effects that you thought,” said C. Michael White, a University of Connecticut pharmacy professor and director of the UConn/Hartford Hospital Evidence-based Practice Center.
The federally funded center, one of 15 in the U.S. and Canada, offers a sort of reality check for scientific research. Its researchers have evaluated interventions ranging from Echinacea (effective in treating colds) to beta blockers in non-cardiac surgery patients (effective in reducing the chance of heart attacks, but with an increased risk of strokes).
More recently, the center produced a report on the effectiveness of human growth hormone in treating cystic fibrosis. The findings will be incorporated into a “plain-language” guide to the topic published by the federal Agency for Healthcare Research and Quality, which funds the center.
While many studies look only at the benefits of a particular drug or intervention, White and his colleagues look at the balance of benefits and harms, how strong the evidence is, and whether the results of the studies are applicable to the real world.
The work is known as comparative effectiveness research, something that has been getting increased attention lately. The federal government devoted $1.1 billion to comparative effectiveness research as part of the stimulus act, and the health reform law established an institute for it.
The topic has proven controversial. In Britain and other countries, comparative effectiveness research drives how money is spent on health care, determining which services are covered by the government health care system, and which ones are not. Americans have expressed widespread support for using the research to gain information, according to a recent survey, but offered less support for using the results to allocate government resources or dictate treatment decisions.
The Evidence-based Practice Center’s work is not binding, and researchers are prohibited from taking cost into account in evaluating the merits of a particular treatment. The work leads to reports aimed at informing patients, doctors, insurers and the federal government.
The goal, say White and Co-Director Craig Coleman, is to help people make health decisions based on the full scientific literature – not on the last article they read, or the most recent high-profile study, or the research a pharmaceutical company representative handed them.
Dr. Jeffrey Kluger, a Hartford Hospital electrophysiologist and associate director of the Evidence-based Practice Center, said the work represents a shift in how doctors make decisions. Doctors once relied largely on their own perceptions. Over time, he said, they began to look to small observational trials, then to multi-center clinical trials. Now, it’s systematic comparative effectiveness reviews that take into account far more evidence.
“This is the future,” he said.
Testing the Studies
The research topics originate as questions. Anyone can submit them; then the federal agency works with stakeholders to determine which would have the most impact.
Researchers go through consultations with experts and public comment before even finalizing the key questions to address. The biggest reports can take up to 16 months to complete.
The review looks at both the effects of the interventions being studied and at the studies themselves. How strong is the evidence? Was it tested in men and women and multiple ethnic groups and locations, or a narrow sliver of the population? And could the methods used actually work in the real world?
That was a key question in reviewing a clinical trial on preventing complications after open-heart surgeries. The group examined it before the center earned its federal designation in 2007.
The trial showed that people who took the drug amiodarone for 14 days before surgery had a decreased risk of atrial fibrillation after surgery. The UConn researchers asked cardiothoracic surgeons if they could apply the study’s findings to their patients.
In the real world, the doctors said, nobody has 14 days before surgery.
So White, Kluger and their colleagues designed their own trial, testing the intervention in real-world conditions.
“If you could do it here, then any of the other hospitals could do it,” White said.
The result: a protocol for using the drug to prevent complications that could accommodate patients with varying amounts of time before surgery.
By reviewing multiple studies, researchers can also identify patterns that were too small to detect in individual trials. At the Evidence-based Practice Center, doing so led to a key finding about beta blockers.
Beta blockers are known to have a beneficial effect in cardiac surgeries. Doctors had started using them in non-cardiac surgeries, and clinical trials showed a reduced risk of heart attacks. They also showed an increased risk of stroke, but in each individual trial, the increase was very small.
The Evidence-based Practice Center analyzed multiple trials, involving more than 10,000 patients – and got a clearer picture: Beta blockers produced a decreased risk for heart attacks in non-cardiac surgeries, but a markedly increased risk of strokes, which tended to be severe.
Finding all the research to review on a given topic is not an exact science. And analyzing the results requires more than reading the published studies. There are often ambiguities in the reported results of clinical trials. “No difference” in a journal article might really mean no statistically significant difference. Evidence-based Practice Center researchers try to reach the study authors for clarification and to find out what results they did not have space to publish.
They watch for publication bias – the tendency, particularly in industry-funded research, for studies with positive results to be published and for those with neutral or negative results not to be.
There are clues that point to research that did not get published. Trial registries list what researchers were looking for, and applications for drug approval are required to include a list of protocols. That can be useful in determining whether what scientists reported in publications was what they actually set out to look for.
The Evidence-based Practice Center’s results are not always popular. It’s hard to please everyone when the audience includes both people who could potentially benefit from a treatment and the insurers that would pay for it.
“There are vested interests on both sides of the issue so regardless of what you find there are going to be very powerful interests who are not going to like your findings,” White said.
The cinnamon study earned Coleman an angry e-mail from Sri Lankan cinnamon farmers – a consequence of the research that he accepts.
“The best thing you can do is to do it right, and to do it comprehensively,” Coleman said.