called percutaneous coronary intervention, usually involving
the insertion of tiny metal scaffolds called stents to prop arteries
open, did not reduce incidence of death or heart attacks in
patients with stable coronary disease. Those two treatments “cost
billions of dollars and supported the existence of entire specialties
for many years,” Ioannidis and his co-authors wrote in January in
the Journal of the American Medical Association. Ioannidis says
the data clearly show that patients were subjected to risk with no
chance of benefit.
While the number of prescriptions for combination hormone
therapy dropped 80 percent or more in the years after the WHI
study, the number of coronary interventions did not decline
nearly as dramatically following the COURAGE trial.
“Defenders of these therapies and interventions wrote rebuttals
and editorials and fought for their specialties, but the reality was
that the best that could be done was to abandon ship,” Ioannidis
wrote in JAMA.
Those sound like fighting words. Yet Ioannidis, 46, is a
soft-spoken academic whose personal style belies the startling
conclusions and impact of his work. “The purpose of my research
is not necessarily to be challenging or killing sacred cows,” he
says. “I’m not very interested in showing that one particular
research paper is wrong, or you did it wrong and I’m correct. My
penchant is to look at the big picture, to look at hundreds of
thousands of associations.” And while Ioannidis advocates for
rigorous review of all existing treatments, his principal aim is to
improve research design and remove bias so that ineffective
treatments never enter practice to begin with.
Ioannidis’s current work stems from his deep love of math
and statistics. He was born in New York City to physician parents
but raised in Athens, Greece, where he excelled at math from a
young age. He attended the University of Athens Medical School,
added a PhD in biopathology, and later trained at Harvard and
Tufts and joined the National Institutes of Health, where he
worked on pivotal HIV research. These days, although he often
collaborates on the design of specific studies, what he mostly
does is meta-research, or the study of studies. Using powerful
number-crunching programs and constantly evolving algorithms,
Ioannidis analyzes many trials, each with many patients. He’s
working to see not so much whether one treatment works or does
not work, or whether one association of a specific risk factor with
one disease is true or false, but whether factors related to the
research process—the number of patients tested, the criteria for
including data, statistical errors in an analysis, even fraud or
financial incentives—may have compromised the data and
conclusions. He burst onto the medical establishment’s radar in
2005 with a paper in PLoS Medicine asserting nothing less than:
“Why Most Published Research Findings Are False.”
Ioannidis says he began to realize in medical school that a lot
of what he was being taught was grounded not in hard data and
evenly applied criteria, but rather in the instincts and habits of
practitioners. Time and again he would be taught the standard of
care for a given diagnosis, and yet be unable to find evidence that
it was the best choice—or sometimes even that it worked at all.
“Experts were dissociated from the numbers,” he observes.
During his early training, Ioannidis says, he grew intrigued
with some pioneering work by the late Thomas Chalmers, former
dean of the Mount Sinai School of Medicine and a researcher at
Harvard and Tufts medical schools. As early as the 1970s,
Chalmers began advocating for large-scale, randomized clinical
trials. He also pioneered the field of meta-analysis, methods
that combine the results of multiple trials, which led, for example,
to the more widespread use of clot-busting drugs in treating
heart attacks.
Ioannidis laughs when he recalls how, early in his career,
doing meta-analysis meant an infuriatingly slow process of going
to the library, reading journals, and trying to compare researchers’
methods to see if different studies reinforced each other’s findings.
Today, powerful computer programs and the growing practice
of putting raw data from experiments in giant online databases
have streamlined the process considerably. He can even do
“meta-analyses of meta-analyses” and help large consortia of
researchers harmonize the design and methods of their studies
in advance so the information is more robust and comparable.
[Photo caption: TRUE OR FALSE: Ioannidis looks at all the data.]
To put Ioannidis’s role in perspective, it might be helpful to
think of a given research investigator as a governor charged with
trying to effectively create a budget, manage infrastructure and
run a state. Ioannidis is like an
economist for the federal government who evaluates the combined performance of all the states’ efforts to grow and thrive,
comparing differences, figuring out what works and doesn’t, and
what the resulting impact is on the national economy.
Right now, for example, he and his colleagues are researching
the published results of thousands of different treatments