Is it safe for you to say ‘yes’ to being a research guinea pig?
When researchers at the University of Maryland School of Medicine in the US were looking for volunteers to test a vaccine for Ebola, which was killing thousands of people in West Africa, Andrea Buchwald raised her hand in Baltimore.
“Scientific curiosity,” the 29-year-old graduate research assistant in Maryland’s department of epidemiology said, explaining why – along with her trust in the system governing treatment of human subjects – she was willing to be experimented on.
“Consent for clinical trials is a very stringent process,” Buchwald added. “You’re expected to do your best to ensure your participants are fully informed and doing this of their own volition. Things have changed a lot since the 1940s.”
It was during that decade that hundreds of Guatemalans were infected with sexually transmitted diseases in the name of research – a horrific reminder of practices brought back to light last month when the Johns Hopkins University was sued for US$1bil (RM3.7bil) by research subjects and their families for its role in approving federal funds for the study.
Hopkins officials said the university didn’t develop or oversee the study and was not responsible.
The days when researchers used impoverished populations, prisoners, prostitutes, orphans and others as human guinea pigs are largely in the past, most would agree. In the most infamous case – the Tuskegee study that ran from the 1930s to the 1970s – black Alabama men with syphilis were left untreated so researchers could trace the terrible progression of the disease.
But even today, concerns arise periodically about the use of human subjects in clinical trials, especially as research institutions and pharmaceutical companies increasingly go abroad to test new drugs and vaccines in countries where oversight can be more lax.
In India, for example, a rash of reports several years ago of people dying during clinical trials, or being enrolled without proper consent, led to an uproar and a government crackdown on what had become a booming industry for the fast-developing country.
While some say Indian officials overreacted, the events reflect the continuing unease with human experimentation.
“I think there is lingering fear and suspicion of research in many quarters,” said Dr Daniel Kuritzkes, a Harvard virologist who had three research studies in India interrupted by the government’s scramble to enact new regulations.
“That’s unfortunate because for the most part, there has been worldwide adoption of laws governing how human subjects are protected in research,” said Kuritzkes, who chairs the AIDS Clinical Trials Group, a National Institutes of Health programme that conducts research in conjunction with institutions around the world. “Things are done very differently than in the past.”
It is hard to imagine studies such as those in Tuskegee or Guatemala happening today, and indeed, the outrage over them led to ever-stricter safeguards to protect human subjects.
That the position of research participant advocate even exists today is testament to the change in how experiments are conducted. These days, there is much greater scrutiny of research proposals involving human subjects.
Researchers who receive NIH funding, for example, must get the approval of their organisation’s Institutional Review Board, which determines whether a proposed study protects human subjects, properly weighs the risks and benefits to them and can document that they provided informed consent.
At Hopkins Medicine alone, there are six such boards that meet on its Baltimore campuses weekly for three hours apiece to handle the volume of research.
The boards approve about 1,800 new protocols a year and oversee about 6,100 trials, according to Hopkins.
But the boards operate in private – Hopkins officials would not allow reporters to observe a meeting for this article – so it can be difficult to assess their work.
“There are tremendous amounts of variability,” said Laura Stark, a professor at Vanderbilt University who wrote the 2011 book Behind Closed Doors: IRBs and the Making of Ethical Research.
“The ethical thing to do in one system may not be considered ethical in another system.”
The University of Minnesota, for example, announced this month that it is overhauling its review process amid criticism of its psychiatric research.
Ethicists have pointed to incidents such as the suicide of a schizophrenic man in 2004 while he was enrolled in a drug trial, questioning whether someone so disturbed could even give informed consent.
The Institutional Review Board system has its origins in a medical past when doctors had much freer rein. In the 1940s and 1950s, for example, Stark writes, Mennonites, Quakers and other religious objectors to war were put in service to their country as research subjects for the NIH.
Some of them were marooned on what is now Roosevelt Island in New York so scientists could study the minimum amount of food and water shipwreck victims might need.
Once the NIH started funding more research off its own campus, officials realised they would need a way to make sure those institutions followed certain standards – and to limit their own liability should something go wrong.
“Basically, the review system got tied to money: If you wanted the money, you had to have a review process,” Stark said. “Medical research is big money: Who’s paying when there’s a lawsuit?”
The research community has long acknowledged the need to protect human subjects. Modern protections began with the Nuremberg Code of 1947, which established the idea of consent and became the foundation for later laws and regulations governing researchers receiving federal dollars.
It was a response to German physicians who experimented on prisoners during World War II.
Then, in 1974, the US passed the National Research Act to codify protections for research subjects. That led to the landmark Belmont Report, which spelled out principles of ethical treatment.

When certain vaccines and treatments are tested, illiteracy in some countries makes it difficult to obtain consent from those who volunteer as test subjects, says Dr Chris Plowe. Photo: TNS
Dr Christopher V. Plowe, a malaria researcher, said the rules were a “strong and appropriate reaction to Tuskegee, among others”.
A lot of current researchers began their careers after the rules were put in place and know no other way, said Plowe, the new director of the University of Maryland School of Medicine’s Institute for Global Health. Many researchers, like him, have even voluntarily strengthened the consent process overseas to address lingering distrust and ensure a study’s integrity.
Plowe said, for example, there are places where malaria, Ebola and cholera vaccines and treatments are tested, but illiteracy makes it difficult to obtain consent.
He said researchers increasingly start by trying to gain the trust of village elders or others with influence, a process called “community permission to enter”.
In most cases, he said, researchers shouldn’t use disadvantaged populations overseas for studies that don’t benefit them. And there are other situations, such as with refugees, that require more scrutiny because potential subjects may feel coerced.
“We want to eliminate malaria in Myanmar, but there has to be community buy-in and political will,” said Plowe, who is working with local health professionals, doctors, the government and even military officials. “We can sit down and talk about health issues. Malaria, everyone agrees, is something we’d like to get rid of.”
For all the improvement in protections for human subjects, there are those who say that laws and regulations have failed to keep up with changes in medicine and research.
Seema K. Shah, head of the NIH unit on international research ethics, said the last revision of regulations stemming from the research law came in 1991, before researchers used social media or could map the human genome, both of which raise privacy questions that remain unaddressed.
Japan, by contrast, revises its regulations every five years, said Shah, who is also on the faculty in the NIH Clinical Center Department of Bioethics.
“The goal in creating laws and ethical norms is to prevent the scandals of the past from happening in the future,” Shah said. “A lot of studies in the past prompted concern from the public and led to changes. But since there is no big crisis now, maybe some things aren’t being put into law.” – The Baltimore Sun/Tribune News Service