SCIENTIFIC CONTRIBUTION
Educating about biomedical research ethics
Bratislav Stankovic • Mirjana Stankovic

B. Stankovic (corresponding author), University of Information Science and Technology "St. Paul the Apostle", Partizanska bb, 6000 Ohrid, Macedonia; e-mail: [email protected]
M. Stankovic, Ministry of Education and Science, Sv. Kiril i Metodij 54, 1000 Skopje, Macedonia; e-mail: [email protected]

© Springer Science+Business Media Dordrecht 2014
Med Health Care and Philos, DOI 10.1007/s11019-014-9561-1
Abstract This article examines the global and worsening
problem of research misconduct as it relates to bio-medico-
legal education. While research misconduct has serious
legal implications, few adequate legal remedies exist to
deal with it. With respect to teaching, research ethics
education should be mandatory for biomedical students and
physicians. Although teaching alone will not prevent mis-
conduct, it promotes integrity, accountability, and respon-
sibility in research. Policies and law enforcement should
send a clear message that researchers should adhere to the
highest standards of ethics in research. It is vital that
researchers and physicians understand basic aspects of law
and the legal system in order to understand medico-legal issues not just in their legal context, but with a sound grounding in their ethical, social, and theoretical contexts, so that they can practice good medicine. Routine
and holistic research ethics education across the curriculum
for medical students and resident physicians, and contin-
uing medical education for practicing doctors, are probably
the best ways to accomplish this goal.
Keywords Research ethics · Scientific misconduct · Research misconduct · Teaching ethics · Responsible conduct of research
Introduction
Bad science anywhere is bad for science everywhere. The
growth and development of interest in research ethics
extends over the past 50 years and has often been spurred
by public concern about various events and issues rearing
their ugly heads in scientific research; moreover, the volume
of research misconduct may be increasing (Schrag 2008;
Cossins 2012; Steen et al. 2013; Hvistendahl 2013). Yet
within the traditional biomedical curriculum, the teaching
of law and ethics has often been both scarce and eclectic,
even though the practice of good medicine inevitably raises
legal and ethical issues and demands an understanding of
both (Manson 2008). Biomedical students and researchers
receive little ethics training. Those trained in law who cross over into biomedicine have received professional ethics training, but it is largely limited to the sphere of ethical issues in legal practice.
The importance of introducing ethics into the medical
curriculum has been established by a number of national
medical institutions and committees. The ethics of medical
education has been previously discussed elsewhere,
although it is frequently limited to issues surrounding
research with human subjects or patient care (Jagsi and
Lehmann 2004). The common barriers to biomedical ethics
instruction appear to be (i) the lack of time within the
curriculum, and (ii) the lack of qualified teachers (Claudot
et al. 2007).
Honestly seeking truth when conducting research should not be construed merely as a moral rule; indeed, it is a necessary
condition for work to qualify as scientific. There are
numerous ways in which the knowledge that researchers
and physicians have about medical ethics, bioethics,
research ethics, and law can have a more direct impact on
both biomedical research and the type and quality of care
that the patient receives. In order to provide medical care in
an ethical and humane way, physicians need to be better
educated about specific aspects of ethical medical practice
and learn to think critically about the increasingly complex
world of biomedical research and practice.
Breaches of research conduct serve as a contact point between the institutions of the scientific community and the institutions of law. Legal institutions are
beginning to exert significant influence upon the practice of
science, including medicine. Through the medium of
research misconduct, the institutions of science are being
driven toward a more rigid and formal structure reflecting
that of legal institutions. This profound and foreign formalization will likely alter the conduct of research—and the practice of medicine—forever. (In this article, the phrases "research fraud", "research misconduct", "scientific fraud", and "scientific misconduct" are used interchangeably; "researcher" and "scientist" are likewise treated as synonyms.)
Bad science: examples of biomedical research
misconduct
Perhaps as an attestation to a peculiar human character
trait, acts of faked or misrepresented scientific and tech-
nical data, falsified professional credentials, misidentifi-
cation of authorship, and plagiarism are neither twentieth-
century inventions, nor are they limited to any one research
field or national research system (LaFollette 2000). The
achievements of some scientific greats have been scruti-
nized in modern times because of suspicions that they were
obtained in less than honest ways. The observations of pea plants made by Gregor Mendel, the "father of genetics", were too good to be true; commentators have diplomatically explained Mendel's results in terms of "occasional subconscious errors in favor of expectation" or suggested that "Mendel must have been deceived by some assistant who knew too well what was expected" (see Stankovic 2004). Even though biomedicine has attracted much of the attention in discussions
about misconduct, historical examples of research mis-
conduct cover a wide range—from the natural sciences via
polar exploration to the humanities (Riis 2001). Yet the
risks associated with research misconduct are increasing; in
addition to reputational damage, they can be pecuniary as
well. Indeed, major public harm in this context will likely
be manifested precisely through the channels of biomedical
research. A recent example is the fraudulent Wakefield
study linking a childhood vaccine and autism (Deer 2011),
the negative implications of which were tremendous: sci-
entists and organizations across the world have spent a
great deal of time and money refuting the results of a minor
paper in the Lancet and exposing the scientific fraud that
formed the basis of the paper. The systematic failures
which permitted the Wakefield fraud were discussed by
Opel et al. (2011). Fake astronomical data, by contrast, have less chance of harming anyone, so some fields are necessarily less affected by research misconduct (Bouville
2008).
The legal problems associated with responsible conduct
of research (RCR) can be studied from cases that received
significant publicity, perhaps because of the accompanying
involvement of researchers affiliated with highly reputable
institutions (Broad and Wade 1982). The few colorful and
memorable cases illustrated below present useful examples
in defining the elusive problem of research misconduct,
highlighting its seriousness and the potential for damage
that betrayers of the truth can create in various fields of
scientific inquiry. They also show that cheaters can go
far—very far. For example, in an extraordinary case in the
1970s, Elias Alsabti managed to temporarily carry out a
successful career as an oncology researcher by misrepre-
senting his credentials and plagiarizing papers published by
other scientists (Broad 1980). He managed to build a career
as an impostor, working as a cancer specialist for various
American research institutions, moving on when his utter
lack of knowledge and understanding was noticed. Alsabti
is estimated to have published 50–60 plagiarized articles in
a few years, often with co-authors who had never published with anyone but him, which led to the suspicion that they might not exist (Weiss et al. 2001). It was also dis-
covered in 1980 that Dr. John Long, who studied Hodg-
kin’s disease at the Massachusetts General Hospital,
fabricated experimental data. Impatient to publish, he made
up numbers that looked right and published findings from
‘‘human’’ cell lines that were contaminated with monkey
cells (Kulynych 1998). Also in the 1980s, John Darsee of
Harvard University was caught red-handed while fabri-
cating raw data in his cardiology experiments. After
compiling an impressive list of publications in reputable
scientific journals, Darsee was found to have fabricated
data for tens of his publications. More than 80 of his arti-
cles were subsequently retracted (Wallis 1983; see also 48 Fed. Reg. 30764 (1983)). In 1988,
Stephen Breuning, a research psychologist with the Uni-
versity of Pittsburgh, pled guilty to federal charges for
falsifying his research results on the effects of treating
hyperactive children with the drugs Ritalin and Dexedrine (United States v. Breuning, No. K-88-0135 (D. Md. Sept. 19, 1988)).
At that time, Breuning’s legacy in the field of mental
retardation was substantial. Between 1979 and 1984,
approximately one-third of the literature in the field was
produced by him (Brand and Nash 1987). William Summerlin,
a dermatologist at the Sloan-Kettering Institute for Cancer
Research, essentially created patchwork mice; he painted
laboratory mice to fake results of skin graft experiments
(Hixson 1976). In the 1990s, the Thereza Imanishi-Kari
(TIK) case involved a dispute over data presented in an
article published in the journal Cell by Thereza Imanishi-
Kari, the Nobel laureate David Baltimore, and others
(Parrish 1998). The TIK case received extraordinary pub-
licity and resulted in over a decade-long investigation and
four Congressional hearings, exposing numerous flaws in
the process of investigation of research misconduct, rang-
ing from influence of the scientific and political commu-
nities to missteps by poorly trained staff involved in the
investigation. In 1999, Kimon Angelides of the Baylor
College of Medicine was found guilty of intentionally
falsifying data and misrepresenting research results in five
published research papers and in grant applications sub-
mitted to the NIH (Dalton 1997). In 2003, ‘‘[i]n a case
more befitting Sherlock Holmes,’’ an article published in
the New England Journal of Medicine, offering hope to patients with hypertrophic cardiomyopathy, found itself at
the center of a forgery row. Names and falsified signatures
of several famous cardiologists were used in a successful
attempt to bolster the credibility of the article (David 2003). In
2005, Seoul National University investigator Woo Suk Hwang and
24 co-authors published what appeared to be a ground-
breaking paper in Science in which they claimed to have
established eleven embryonic stem cell lines containing
nuclear DNA from somatic cells of research subjects
(Hwang et al. 2005). If the research had been sound, it
would have been one of the most important developments
in biomedicine in the twenty-first century, bringing money
and glory to South Korea, and could have earned Hwang a
Nobel Prize. However, due to fraudulent manipulation of
the data, the research became known as one of the biggest
scientific disappointments in this century, creating a set-
back in the field of embryonic stem cell research (Resnik
et al. 2006).
Research misconduct is neither limited exclusively to
the medical field, nor is it solely associated with academia.
Notable examples of misconduct have been discovered
with research performed in the private biomedical sector.
In the mid-1990s, Dr. Robert Fiddes was a well-known and
respected clinician at the Southern California Research
Institute, and the lead clinical investigator on over 170
clinical trials, where he oversaw the testing of new drugs
on patients. At the time, Fiddes was a wonder boy in the
lucrative business of drug testing, paid well to test new
drugs, and known for his ability to recruit and retain patients and to generate thorough results; he maintained his successful practice with lies and fraud for a number of years
(Eichenwald and Kolata 1999). He was eventually caught,
and in 1997 pled guilty to fraud charges and was sentenced
to 15 months in prison (Swaminathan and Avery 2012). In
the late 1990s, executives of SMLX Technologies made
false statements to the Food and Drug Administration
(FDA) related to HIV saliva kits. In 2001, after the fraud
was discovered, the company was ordered to pay $197,500
in restitution and a $150,000 fine. Two executives were
sentenced to fifteen months in jail for mail fraud and for
making false statements to the FDA (Hasty 2001). Con-
ducting research ethically and honestly is not only a reg-
ulatory concern, but a business one as well.
These and other similar cases of misconduct had rip-
pling effects in the scientific community and renewed the
questions of research honesty, responsibility, and over-
sight. While the examples give some inkling of the variety
of research misconduct that has received attention within
the academic and industrial scientific community, and of the technical complexity of the factual situations that such misconduct entails, they also highlight the potential for sig-
nificant damage in the affected field of study. The rate of
research misconduct has been estimated to be in the low
single digits, in percentage terms (1–2 % of researchers per year, based on confirmed cases of misconduct in fed-
erally funded research and self-reports of misconduct on
anonymous surveys). However, admission and perception
may be different from reality, and the numbers may be
substantially higher. A pooled weighted average of 1.97 %
of scientists admitted to having fabricated, falsified, or mod-
ified data or results at least once—a serious form of mis-
conduct by any standard—and up to 33.7 % admitted other
questionable research practices. In surveys asking about the
behavior of colleagues, admission rates were 14.12 % for
falsification, and up to 72 % for other questionable
research practices (Shamoo and Resnik 2009; Fanelli
2009). Due to the heterogeneity of acts, various methods to
discriminate between different levels of misconduct have
been suggested (Greenbaum 2009). In the eyes of the
malefactors, the risk of getting caught is probably out-
weighed by the otherwise unattainable prospects of pro-
jecting stellar academic achievements and acquiring
tenure.
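(A note on the summary statistic quoted above: a pooled weighted average combines the individual survey estimates into a single figure. A minimal formulation, assuming k surveys reporting admission rates p_i with weights w_i (for example, sample sizes; the exact weighting scheme used by Fanelli (2009) is not reproduced here), is

\bar{p} = \frac{\sum_{i=1}^{k} w_i \, p_i}{\sum_{i=1}^{k} w_i},

so that larger or more precise surveys contribute proportionally more to the reported figure of 1.97 %.)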
Research ethics codes
Given the importance of ethics for both research conduct
and medical practice, different professional associations,
government agencies, and universities have adopted spe-
cific codes, rules, and policies relating to research ethics.
The conduct of clinical research is also reflected in the
development and adoption of international ethical codes,
such as the Nuremberg Code of 1947, the Helsinki Dec-
laration of 1964, the Council for International Organizations of Medical Sciences (CIOMS) guidelines of 1993, and the Uniform Require-
ments for Manuscripts Submitted to Biomedical Journals of
2006. In the United States (USA), an alphabet soup of
government agencies such as the National Institutes of
Health, the National Science Foundation, the Food and
Drug Administration, the Environmental Protec-
tion Agency, and the US Department of Agriculture have
ethics rules for funded researchers.
Many institutions require training in human research
ethics or animal research ethics (for those using humans or
animals in research) even if they do not mandate education
in responsible conduct of research (misconduct, authorship
etc.).
The following is a rough and general summary of some
ethical principles that various research ethics codes address
(adapted from Shamoo and Resnik 2009):
(i) Honesty: Honest reporting of data, results, meth-
ods, procedures, and publication status. No
fabrication, falsification, or misrepresentation of
data. No deception of colleagues, granting agen-
cies, or the public;
(ii) Objectivity: Avoidance of bias in experimental
design, data analysis, data interpretation, peer
review, personnel decisions, grant writing, expert
testimony, and other aspects of research where
objectivity is expected or required. Disclosure of
potential conflicts, personal or financial interests
that may affect research outcome;
(iii) Integrity: Keeping promises and agreements,
acting with sincerity, striving for consistency of
thought and action;
(iv) Carefulness: Avoidance of careless errors and
negligence. Careful and critical examination of
own work and the work of peers. Keeping good
records of research activities, data, research
design, and correspondence with agencies or
journals;
(v) Openness: Sharing data, results, ideas, tools,
resources;
(vi) Respect for intellectual property: Honoring pat-
ents, trademarks, copyrights, trade secrets, and
other forms of intellectual property. Giving
proper acknowledgement and credit for all con-
tributions to research. Zero tolerance for
plagiarism;
(vii) Confidentiality: Protection of confidential com-
munications, papers or grants submitted for
publication, personnel records, trade or military
secrets, and patient records;
(viii) Responsible publication: Publication designed to
advance research and scholarship, not to advance
just own career. Avoidance of wasteful and
duplicative publication;
(ix) Responsible mentoring: Helping educate, mentor,
and advise students and junior researchers;
(x) Respect for colleagues: Fair and equal treatment
of colleagues;
(xi) Social responsibility: Striving to promote social good through research, public education,
outreach, and advocacy;
(xii) Non-discrimination: Avoidance of discrimination
against colleagues or students on the basis of sex,
race, ethnicity, or other factors unrelated to their
scientific competence and integrity;
(xiii) Competence: Maintenance and improvement of
own professional competence and expertise
through lifelong education and learning;
(xiv) Legality: Knowledge and obedience of relevant
laws, institutional and governmental policies and
regulations;
(xv) Animal care: Exercise of proper respect and care
for animals when using them in research;
(xvi) Protection of human subjects: Minimized harms
and risks. Respect for human dignity, privacy,
and autonomy. Taking special precautions with
vulnerable populations. Fair distribution of the
benefits and burdens of research.
Although RCR codes, policies, regulations and princi-
ples are important and useful, like any set of rules, they do
not cover every situation; they often conflict, and they may
require considerable interpretation. It is therefore important
for biomedical students and researchers to learn how to
interpret, assess, and apply various research rules and how
to make decisions and to act in various situations. While
the majority of decisions involve the straightforward
application of ethical rules, in developing fields, new codes
are being established. For example, the patentability of
human embryonic stem cells was challenged in both USA
and European Union (EU) courts; the normal ‘‘standard of
care’’ against which new interventions are tested in medical
research has not been formally defined. A new, proactive
research ethics must be constantly adapting to the
advancing frontiers of biotechnology, and must also be
concerned with the great inequalities in global health
(Benatar and Singer 2000).
There are many other activities that may or may not be
defined under the rubric of ‘‘misconduct’’ but which can be
still regarded by most researchers as unethical. These are
generally called ‘‘deviations from acceptable research
practices’’ and include: (i) publishing the same paper in two
different journals without telling the editors; (ii) submitting
the same paper to different journals without telling the edi-
tors; (iii) not informing a collaborator of the intent to file a
patent; (iv) including a colleague as an author on a paper even
though the colleague did not make a serious contribution to
the paper; (v) discussing with colleagues confidential data
from someone else’s submitted, unpublished paper; (vi)
trimming outliers from a data set without discussing reasons;
(vii) using an inappropriate statistical technique in order to
enhance the significance of the data; (viii) bypassing the peer
review process and announcing own results through a press
conference without giving peers adequate information to
review the work; (ix) conducting a selective review of the
literature that fails to acknowledge the contributions of other
people in the field or relevant prior work; (x) stretching the
truth on a grant application in order to convince reviewers
that the project will make a significant contribution to the
field; (xi) stretching the truth on a job application or curric-
ulum vitae; (xii) giving the same research project to two or
more students in order to create competition; (xiii) over-
working, neglecting, or exploiting (post-)graduate students
or post-doctoral researchers; (xiv) failing to keep good
research records; (xv) making derogatory comments and
personal attacks in a review of submitted manuscript; (xvi)
promising a student a better grade in return for favors,
including sexual favors; (xvii) using a racist epithet in the
laboratory; (xviii) making significant deviations from the
research protocol approved by the institution’s Institutional
Review Board or Animal Care and Use Committee without
telling the board or the committee; (xix) not reporting an
adverse event in a human research experiment; (xx) wasting
animals in research; (xxi) exposing students and staff to
biological risks in violation of biosafety rules; (xxii) reject-
ing a manuscript for publication without justifiable reason or
even without reading it; (xxiii) sabotaging someone’s work;
(xxiv) rigging an experiment so as to tailor its outcome; (xxv)
making unauthorized copies of data, papers, or computer
programs; (xxvi) owning (in the USA) over $10,000 in stock
in a company that sponsors one’s research and not disclosing
this financial interest; (xxvii) deliberately overestimating the
clinical significance of a new drug in order to obtain eco-
nomic benefits (Resnik 2011).
Whose problem?
All stakeholders have ethical responsibilities to address
research misconduct. Indeed, honestly seeking truth should
not be construed as a mere moral rule—it is instead a condition
sine qua non for work to qualify as scientific. The question
is how to identify the most effective means of improving respect for the ethics codes. Positive induce-
ments and reinforcements achieve little in this context.
People respond to positive inducements not rationally but
through their conditioned instincts. These are not applica-
ble in the context of scientific misconduct, because the
prospects of individual rewards may be more attractive than
the good of the scientific community.
While many scientists are conscientious and strive to
ensure high quality of their research while maintaining
high ethical standards, occasional bad apples abuse their
research position, consequently tarnishing the reputation of
the scientific community. Accordingly, all available sci-
entific and legal tools should be used to prevent and
eradicate the problem of research fraud, which is giving a
black eye to science. Raising awareness and openly dis-
cussing the issue should foster a research environment that is intolerant of fraud. Colleagues should be encouraged to
report wrongdoing. The stigma associated with raising
concerns (whistleblowing) has to be removed. The role of
peer review in discovering misconduct is important, as a
frequent goal of fraudulent science is to make its way into
mainstream publications. Journal editors should be
particularly vigilant, as software has made it easier to
commit some types of misconduct (copy-paste of text and
data, image manipulation, etc.). However, the same soft-
ware that can be used to 'doctor' digital images can also
be used to detect image alterations (White 2007). Text-matching tools such as CrossCheck/iThenticate/Turnitin (http://www.turnitin.com) can be useful for screening submissions for plagiarism or redundant publication.
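To illustrate the screening idea only (this is a minimal sketch, not how CrossCheck, iThenticate, or Turnitin work internally, since commercial services match against large indexed databases), a submission can be compared against previously published texts by counting shared word n-grams; the toy corpus, threshold, and function names below are illustrative assumptions.

# Minimal illustrative sketch of text-overlap screening in Python. The corpus,
# threshold, and function names are hypothetical; real plagiarism-detection
# services use large indexed databases and more sophisticated matching.
from typing import Dict, List, Set, Tuple

def shingles(text: str, n: int = 5) -> Set[str]:
    # Overlapping word n-grams ("shingles") of the lower-cased text.
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def overlap_score(submission: str, source: str, n: int = 5) -> float:
    # Jaccard similarity of shingle sets: 0.0 = no overlap, 1.0 = identical.
    a, b = shingles(submission, n), shingles(source, n)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def screen(submission: str, corpus: Dict[str, str], threshold: float = 0.25) -> List[Tuple[str, float]]:
    # Flag corpus documents whose overlap with the submission exceeds the threshold.
    return [(doc_id, score) for doc_id, text in corpus.items()
            if (score := overlap_score(submission, text)) >= threshold]

if __name__ == "__main__":
    corpus = {"published_paper": "the effects of treating hyperactive children were studied in a randomized trial"}
    flagged = screen("the effects of treating hyperactive children were studied in a randomized trial of stimulant drugs", corpus)
    print(flagged)

A high overlap score flags a pair of texts for human review; it does not by itself establish plagiarism or redundant publication.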
Institutions and funders should consider the impacts
their policies may have on scientific misconduct, and
should work to reduce unintended consequences. For
example, in some countries (e.g., Macedonia), researchers
get direct financial rewards for publishing in international
journals with impact factor. Against such backdrop, even
though most researchers would not fabricate results, the
temptation to commit lesser offenses, for example adding
colleagues’ names to papers—becomes harder to resist
under pressure.
Discussions of scientific misconduct seem all too often
to stop at the simplified notion that it is a bad thing (Ball
2008). While ‘‘scientists respond to [fraud] with all the
sting of moral indignation, denouncing it as a crime and
labeling perpetrators as charlatans and scoundrels’’
(Zuckerman 1984), and morality in science may be more
important than science itself (Bouville 2008), community
repugnance has failed to prevent scientific fraud. We all
know that scientific misconduct is so intrinsically wrong,
yet it continues to propagate itself. That is why we believe
that the time has come for harsher punishment of the
fraudsters in science. Temporary debarment from funding
is just a slap on the wrist, and has so far had little deterrent
effect. Instead, criminal prosecutions should send a signal
to the scientific community that fraud is unacceptable.
Promoting biomedical research ethics through
education
Ethics and law applied to biomedicine is an emerging
academic discipline with intrinsic and rigorous standards.
Teaching of RCR should be widely shared within bio-
medical research institutions and medical schools. Both
ethics and law should be introduced systematically in order
to prepare the biomedical students to meet their profes-
sional and legal responsibilities. In a holistic approach,
ethics and law teaching should be features of the whole
curriculum, should begin early, and should be fully inte-
grated with the rest of the curriculum through provision of
courses and workshops not only for students but also for
teachers. Teaching in ethics and law should feature in the
students’ research and clinical experience, consistently
forging links with good biomedical research and practice
(Consensus statement by teachers of medical ethics and
law in UK medical schools, 1998). Through proper education, some scientists need only to be reminded, while others need to be conditioned, in order to achieve ethical purity in research.
A growing body of literature attempts to evaluate the
effectiveness of RCR education. Some studies have shown
that RCR education can enhance knowledge and under-
standing of ethical concepts, norms, and rules; promote
awareness of ethical issues and problems; improve ethical
reasoning abilities; and shape ethical attitudes; but no
studies have shown that RCR education has a positive
impact on ethical behavior (Plemmons et al. 2006; Powell
et al. 2007; Antes et al. 2009, 2010; May and Luth
2013; Resnik 2014). An inventory of the teaching of ethics
within the medical schools of the EU identified a notable
disparity among schools of medicine in their programs as
well as in the number of hours and in the different cate-
gories of teachers (Claudot et al. 2007). These differences
are not due to a top-down approach; instead, they appear to be endogenously generated, favoring a transversal mode of
teaching which does not require staff with relevant exper-
tise. In the medical schools of the EU, in a manner similar
to the USA (Silverberg 2000), ethics teaching is largely
conducted by doctors or ethics specialists, who may be
trained in ethics, but are not necessarily ‘‘ethicists’’. The
insufficiency of multidisciplinarity (e.g., the lack of input from the human and social sciences) does not allow for a confrontation of different points of view, and limits the possibility of treating dilemmas beyond the biomedico-scientific-technical angle, in a more global manner.
This multidisciplinarity could be fostered by means of
better involvement of ethics committees in the teaching of
ethics (Comite Consultatif National d’Ethique 2004).
Without empirical data it is of course unclear whether
and how much training and education in research ethics
might help reduce the rate of research misconduct. The
answer to this question depends, in part, on how one
understands the causes of misconduct. There are two main
theories about why researchers commit misconduct.
According to the ‘‘bad apple’’ theory, most scientists are
highly ethical. Only researchers who are morally corrupt,
economically desperate, or psychologically disturbed
commit misconduct. According to the ‘‘stressful’’ or
‘‘imperfect’’ environment theory, misconduct occurs
because institutional pressures, incentives, and constraints
encourage people to commit misconduct, such as pressures
to publish or obtain grants or contracts, career ambitions,
the pursuit of profit or fame, poor supervision of students
and trainees, and poor oversight of researchers (Stankovic
2004; Resnik 2011; Hvistendahl 2013).
Fraudulent research often enters the public record
without being detected for years. To the extent that
the research environment is an important factor in misconduct,
a course in research ethics is likely to help people get a
better understanding of these stresses, sensitize people to
ethical concerns, and improve ethical judgment and deci-
sion making. Training in research ethics should be able to
help biomedical researchers grapple with ethical dilemmas
by introducing researchers to important concepts, tools,
principles, and methods that can be useful in resolving
these dilemmas.
Conclusion
Medical ethics is necessary to uphold a relationship of
solidarity between the science and art of medicine and the
values that are essential to society. It is also a condition
sine qua non of the appropriate societal response to the
emergence of new biomedical technologies and exigencies
related to patients’ rights. In a societal sphere where hon-
esty is one of the cardinal principles and where human life
may be at stake, research misconduct is perceived as
merely a concern for peer review when it really should be a
major concern for legal review. Peer review alone is not the
proper mechanism to identify it and deal with it (Kohn
1986). Accordingly, there is a need for a more rigorous and
legalized approach towards the control of scientific mis-
conduct. Institutions that conduct research need to strictly
comply with a variety of legal rules, which are generally
laxly enforced. As a matter of policy, guarding research integrity through a vigilant system of adjudication will allow the weeding out and deterring of malefactors, the achievement of excellence beyond personal gain, and the retention of the public's trust (Stankovic 2004). The ability
to care for patients will progress best when the clinical
studies are transparent and adherent to high scientific and
ethical standards, and the ‘‘decade of misconduct’’ (Cossins
2012) will be followed by a ‘‘decade of honor’’.
Research ethics education should be mandatory and
holistic for researchers, medical students, and physicians.
In fact, the issues have become so important that the NIH
and NSF have mandated training in research ethics for
graduate students. Hopefully, concomitant with interna-
tional recommendations, medical schools will progres-
sively grant ethics instruction the importance it deserves. Although teaching alone will not prevent mis-
conduct, it promotes integrity, accountability, and respon-
sibility in research. Policies and law enforcement should
send a clear message that researchers should adhere to the
highest standards of ethics in research. It is vital that
physicians understand basic aspects of law and the legal
system in order to understand bio-medico-legal issues not just in their legal context, but with a sound grounding in their ethical, social, and theoretical contexts, so that they can practice good medicine. Routine
ethics education for biomedical students, researchers, and
resident physicians, and continuing medical education for
practicing doctors, are probably the best ways to accom-
plish this goal. Indeed, bad science anywhere is bad for
science everywhere.
Acknowledgments This work is supported under the European
Commission’s Seventh Framework Programme (FP7). We are grateful
to two anonymous reviewers for helpful comments on an earlier draft.
References
Antes, A.L., S.T. Murphy, E.P. Waples, M.D. Mumford, R.P. Brown,
S. Connelly, and L.D. Devenport. 2009. A meta-analysis of
ethics instruction effectiveness in the sciences. Ethics and
Behavior 19(5): 379–402.
Antes, A.L., X. Wang, M.D. Mumford, R.P. Brown, S. Connelly, and
L.D. Devenport. 2010. Evaluating the effects that existing
instruction on responsible conduct of research has on ethical
decision making. Academic Medicine 85(3): 519–526.
Ball, P. 2008. Crime and punishment in the lab. Nature. doi:10.1038/
news.2008.1015.
Benatar, S.R., and P.S. Singer. 2000. A new look at international
research ethics. BMJ 321(7264): 824–826.
Bouville, M. 2008. Crime and punishment in scientific research.
Physics and Society: arXiv:0803.4058 [physics.soc-ph].
Brand, D. and J.M. Nash. 1987. It was too good to be true. Time, June
1, 1987, at 59 (quoting Dr. Robert Sprague, University of
Illinois, Urbana-Champaign).
Broad, W.J. 1980. Would-be academician pirates papers. Science
208(4451): 1438–1440.
Broad, W.J., and N. Wade. 1982. Betrayers of the Truth. New York:
Simon & Schuster.
Claudot, F., F. Alla, X. Ducrocq, and H. Coudane. 2007. Teaching
ethics in Europe. Journal of Medical Ethics 33(8): 491–495.
Comite Consultatif National d’Ethique. 2004. Opinion on education
in medical ethics, No. 84, April 29, 2004, available at http://www.ccne-ethique.fr/en/publications/opinion-education-medical-ethics#.Up54RMRDvHQ. Accessed 4 April 2014.
Consensus statement by teachers of medical ethics and law in UK
medical schools. 1998. Teaching medical ethics and law within
medical education: a model for the UK core curriculum. Journal
of Medical Ethics 24(3): 188–192.
Cossins, D. 2012. A decade of misconduct. The Scientist, November
27, 2012, available at www.the-scientist.com/?articles.view/articleNo/33464/title/A-Decade-of-Misconduct/. Accessed 20 December 2013.
Dalton, R. 1997. The Angelides Affair. Houston: Houston Press.
David, A. 2003. Paper retracted as co-author admits forgery. Nature
421(6925): 775.
Deer, B. 2011. How the case against the MMR vaccine was fixed.
BMJ 342: c5347.
Eichenwald, K. and G. Kolata. 1999. A doctor’s drug trials turn into
fraud. N.Y. Times, May 17, 1999, at 1.
Fanelli, D. 2009. How many scientists fabricate and falsify research?
A systematic review and meta-analysis of survey data. PLoS
ONE 4(5): e5738. doi:10.1371/journal.pone.0005738.
Greenbaum, D. 2009. Research fraud: methods for dealing with an
issue that negatively impacts society's view of science. Colum-
bia Science and Technology Law Review 10: 61.
Hasty, S. 2001. Firm Fined for Selling Unapproved AIDS Test. AIDS
Weekly, May 21, 2001, available at http://www.newsrx.com/
newsletters/AIDS-Weekly/2001-05-21.html. Accessed 4 April
2014.
Hixson, J. 1976. The Patchwork Mouse. Garden City: Anchor Press.
Hvistendahl, M. 2013. China’s publication bazaar. Science
342(6162): 1035–1039.
Hwang, W.S., S.I. Roh, et al. 2005. Patient-specific embryonic stem
cells derived from human SCNT blastocysts. Science 308(5729):
1777-1783, erratum in Hwang, W.S., S.I. Roh, et al. 2005.
Science 310(5755): 1769, retraction in Kennedy, D. 2006.
Science 311(5759): 335.
Jagsi, R., and L.S. Lehmann. 2004. The ethics of medical education.
BMJ 329: 332–334.
Kohn, A. 1986. False Prophets. New York: Basil Blackwell Inc.
Kulynych, J. 1998. Intent to deceive: mental state and scienter in the
new uniform federal definition of scientific misconduct. Stanford
Technology Law Review 2(1998): 2.
LaFollette, M.C. 2000. The evolution of the ‘‘scientific misconduct’’
issue: an historical overview. Experimental Biology and Med-
icine 224(4): 211–215.
Manson, H. 2008. The need for medical ethics education in family
medicine training. Medical Ethics 40(9): 658–664.
May, D.R., and M.T. Luth. 2013. The effectiveness of ethics
education: a quasi-experimental field study. Science and Engi-
neering Ethics 19(2): 545–568.
Opel, D.J., D.S. Diekema, and E.K. Marcuse. 2011. Assuring research
integrity in the wake of Wakefield. BMJ 342: d2.
Parrish, D.M. 1998. The federal government and scientific miscon-
duct proceedings, past, present, and future as seen through the
Thereza Imanishi-Kari case. Journal of College & University
Law 24: 581.
Plemmons, D.K., S.A. Brody, and M.W. Kalichman. 2006. Student
perceptions of the effectiveness of education in the responsible
conduct of research. Science and Engineering Ethics 12(3):
571–582.
Powell, S.T., M.A. Allison, and M.W. Kalichman. 2007. Effective-
ness of a responsible conduct of research course: a preliminary
study. Science and Engineering Ethics 13(2): 249–264.
Resnik, D.B., A. Shamoo, and S. Krimsky. 2006. Fraudulent human
embryonic stem cell research in South Korea: lessons learned.
Accountability in Research 13(1): 101–109.
Resnik, D.B. 2011. What is ethics in research & why is it important?
NIEHS-NIH, available at http://www.niehs.nih.gov/research/
resources/bioethics/whatis/. Accessed 4 April 2014.
Resnik, D.B. 2014. Editorial: Does RCR education make students
more ethical, and is this the right question to ask? Accountability
in Research 21(4): 211–217.
Riis, P. 2001. Scientific dishonesty: European reflections. Journal of
Clinical Pathology 54(1): 4–6.
Schrag, B. 2008. Teaching research ethics: changing the culture of
science. Teaching Ethics 8: 79–110.
Shamoo, A., and D. Resnik. 2009. Responsible Conduct of Research,
2nd ed. New York: Oxford University Press.
Silverberg, L.L. 2000. Survey of medical ethics in US medical
schools: a descriptive study. The Journal of the American
Osteopathic Association 100(6): 373–378.
Stankovic, B. 2004. Pulp fiction: reflections on scientific misconduct.
Wisconsin Law Review 2004: 975–1013.
Steen, R.G., A. Casadevall, and F.C. Fang. 2013. Why has the number
of scientific retractions increased? PLoS ONE 8(7): e68397.
doi:10.1371/journal.pone.0068397.
Swaminathan, V., and M. Avery. 2012. FDA enforcement of criminal
liability for clinical investigator fraud. Hastings Science &
Technology Law Journal 4: 325–356.
Wallis, C. 1983. Medicine: fraud in a Harvard lab. Time, February 28, 1983.
Weiss, R.W., G.G. Gill, and C.A. Hudis. 2001. An on-site audit of the
South African trial of high-dose chemotherapy for metastatic
breast cancer and associated publications. Journal of Clinical
Oncology 19(11): 2771–2777.
White, C. 2007. Software makes it easier for journals to spot image
manipulation. BMJ 334(7594): 607.
Zuckerman, H. 1984. Norms and deviant behavior in science. Science,
Technology and Human Values 9(1): 7–13.