Myths of Self-Governance

Backstage Decisions, Front-stage Experts: Part 3

So far, we’ve talked about how the 1975 Asilomar conference created a system in which scientists escaped government and public scrutiny. But whose perspectives and voices were sidelined in the process?

Find out in this new installment of Backstage Decisions, Front-stage Experts!


Missed the last section? Click here!

Want to start from the beginning? Click here!


As we discussed in the previous post, genetic engineering scientists were praised in 1974 for establishing an international moratorium on rDNA experiments. The moratorium signalled that the scientific community was holding itself accountable for the products and technologies it produced. Scientists’ efforts to prioritize public health and environmental safety over the advancement of scientific knowledge were viewed as a noble sacrifice by the broader scientific community and public leaders. [1] When scientists eventually lifted the moratorium at the Asilomar conference, the decision was praised as an act of solidarity within the scientific community, and it resulted in the creation of concrete forms of governance and oversight, including enforceable standards and new organizational bodies. [2]

In the decades that followed, Asilomar came to occupy a central place in the memory of scientists as the ideal model of consensus building. More than thirty years later, conferences and special journal issues continue to commemorate it as a key historical moment. [3][4][5][6] The idea that scientists should come together to debate the merits and potential ills of the technologies they produce became synonymous with the “Asilomar model.”

However, the pitfalls and complexities of the deliberations during and after Asilomar are often understated. When Jonathan Moreno, a prominent philosopher and bioethicist, was asked how contemporary genetic engineering technologies should be governed, he stated, “There’s a nearly reflexive tendency to think of Asilomar, but Asilomar has become for biology what Woodstock has become for youth culture—a mythology that’s grown but that obscures how muddy the event itself was at the time.” [7] To develop a critical understanding of the cultural significance of Asilomar, we will explore how this myth set expectations for future decision-making in science and reproduced assumptions about the public’s ability to contribute to these processes.

The new forms of oversight created after Asilomar kept scientists at the helm of the scientific enterprise and further developed a system in which science operated autonomously, largely free from the external scrutiny of government bodies and public interests. Understanding how this autonomy was maintained, and evaluating the extent to which public input was integrated into decision-making, is important for developing more inclusive ways of addressing the social, political, and economic implications of new biotechnologies.

As we looked through old meeting notes, letters between scientists, and news articles, we began to dispel the myth of Asilomar. We uncovered voices that were sidelined at the original meeting and found disagreements that arose after 1975 about the guidelines and the legitimacy of the new governing bodies.

A small cast and even smaller stage crew

The conference hall in 1975 was dominated by molecular biologists. The topics guiding the discussion were predetermined by the conference organizers, who were highly decorated academic researchers. This small and specialized group limited the focus to the physical and biological risks of rDNA and effectively foreclosed any discussion of the role of science in society. [8] Even at the time of Asilomar, observers recognized that having a small group composed only of scientists set the agenda was a presumptuous way of deciding which public interests and societal goals deserved attention.

A decision needed to be made about whether to lift or extend the moratorium. Arguments against lifting the moratorium stressed the need to consider broader social issues and recognized that the scientific community alone could not regulate the development of rDNA technology. For example, Science for the People (SftP), a left-leaning association of scientists, wrote to the organizers of the 1975 Asilomar meeting with a series of concerns they felt needed to be addressed. They argued that making scientists solely responsible for regulating new biotechnology “is like asking the tobacco industry to limit the manufacture of cigarettes.” The letter goes on:

There are even broader social issues that must be considered. The growing preoccupation with technologies involving genetic manipulation, and parallel developments such as cell fusion and in vitro fertilization, all point to the application of these techniques for human genetic manipulation. Technology and scientific development, even when labelled biomedical, is not intrinsically socially beneficial. Specifically, technologies pointing to the modification of human genetic material must be examined with the greatest care to understand why they are being so eagerely [sic] developed, and for precisely whose benefit. [9]

Despite this, scientists at Asilomar did not discuss the issues surrounding human genetic modification. Furthermore, the biosafety discussions did not include the populations most likely to be exposed to the health hazards of engineered viruses, one of the most salient being an increased risk of cancer. The open letter from SftP urged that collective decision-making on lab safety include those most immediately at risk (technicians, students, custodial staff, etc.). Instead, senior, well-established scientists who did not typically work at the lab bench were in charge of setting the guidelines. The choice to ignore the concerns of SftP and to disregard these stakeholders’ opinions in the construction of standards and guidelines established norms for how decision-making proceeded after the Asilomar meeting.

Outcomes of Asilomar

If we look at what came out of Asilomar, we start to see the outlines of a particular model of biotechnology governance, one that keeps scientists in control. After the attendees developed provisional biological safety guidelines, scientists from academia continued to gather in small committees to further cultivate their vision of rDNA regulation. These meetings led to the creation of the Recombinant DNA Advisory Committee (RAC) inside the National Institutes of Health (NIH). When first created, the RAC was made up mostly of bacterial geneticists, because most of the relevant research was being done in bacteria. Non-scientific members of “the public” were not included on the committee. [10]

Scientists at Asilomar felt comfortable having the NIH house the RAC and enforce the rDNA research guidelines because the NIH’s intramural labs had directed a large amount of monetary and infrastructural support toward molecular biology research in the 1960s. [11] Involving the NIH would both strengthen the position of expert committees and grant them legitimacy in the face of public scrutiny.

The guidelines that scientists had refined after Asilomar were adopted by the RAC as a set of interim rules for federally supported laboratories. [12] The intent of these guidelines was to reduce risks to lab personnel and provide guidance on how to conduct work using rDNA. In 1976, the NIH published revised guidelines in the Federal Register that classified different types of experiments and gave specific instructions for how to monitor lab work in each category. Over time, these classifications evolved into the safety protocols used in academic and industry laboratories today.

However, six months after the guidelines were issued, it was clear they required revision. The knowledge of infectious disease experts and environmentalists, for example, had been left out of the guidelines. Issues around the role of industry in science also emerged in the years that followed. Scientists at the NIH and in academia began to recognize that the accepted guidelines did not apply to research conducted with private funds or in industry: the only way the RAC could enforce the NIH guidelines, after all, was by restricting or removing funding from laboratories whose experiments did not comply. [13] In 1976, a Federal Interagency Commission was formed to review existing research in private and federally funded labs and determine whether wider legislation was necessary. [14]

The need for revision was also voiced by Donald Frederickson, the director of the NIH, which housed the RAC. Reflecting on the process, he wrote, “The more we embedded the Guidelines in inflexible administrative molds, the less chance there would be for timely accommodation of the tide of new information that was already rising.” Frederickson advocated for and helped implement public hearings to deliberate on proposed guideline revisions, and he worked to increase public participation in the RAC by requiring that non-scientific community members be involved in its activities. Twisting the spirit of Frederickson’s move toward more formal interagency governance of recombinant DNA, other scientists pointed to the inclusion of members of the public on the RAC as an argument against converting the guidelines into legislation, claiming it signalled that scientists would already work with the interests of the public in mind. [15] These outcomes would go on to shape how scientific communities perceive and engage with various publics.

Public engagement: The reinforcement of the information deficit model

The memory of Asilomar perpetuates the mindset that decision-making by the scientific community must remain separate from decision-making that involves the public.

Following the release of the NIH guidelines on containment strategies for rDNA, public discourse continued around the lack of transparency in conversations about the new technology and the limited engagement with various public stakeholders. During this time, institutions heavily invested in unhindered scientific progress reached out to people outside the scientific community through schooling and the media in order to promote scientific literacy and communication between scientists and the public. These initiatives included funding high-quality K-12 STEM education, training teachers to teach STEM courses, and supporting government and non-government programs that fostered early science learning and science outreach to families and communities.

Appeasing the public became crucial because any fluctuations in the public’s perception of the value, meaning, and usefulness of science could leave the funding of scientific research politically vulnerable. Unfortunately, these early initiatives, and even present-day initiatives, treat the public as a homogeneous entity that lacks scientific knowledge. This practice stems from the information deficit model: the mindset that once members of the public have the necessary scientific information, they will understand the rationale behind specific scientific decisions and see the world as scientists do. In this framing, the public becomes an “other,” creating a harmful and ineffective “us vs. them” dichotomy that reinforces stereotypes about the public.

This view of the public and of science outreach is reflected in curricula. Science curricula tend to emphasize the original definition of scientific literacy, which focuses on people’s understanding of scientific terms, general scientific processes, and policy issues related to science, [16] rather than discussing the values and interests of those affected. [17] Even at the graduate level, few STEM programs offer formal training in science communication or in the social science practices that would enhance students’ awareness of how people form opinions about science. Additionally, those outside scientific circles have lived experiences that inform their opinions about science and that need to be acknowledged as useful in shaping the trajectory of science-related decisions.

Given the importance of scientific and technological knowledge for global economies, health and wellness, and the environment, it is imperative that we understand the effects of this kind of boundary-making between scientists and the public, which fails to acknowledge the experiences and knowledge of historically disadvantaged populations. The myth of Asilomar continues to serve as a cognitive tool that justifies prioritizing key scientific stakeholders in discussions about emerging technologies. In the next post, we describe how this boundary has restricted the kinds of questions asked of new technologies and their uses, and we discuss the application of the Asilomar model to the development of a more recent biotechnology.


Click here for the next part!

Click here for the entire series!


[1] Gisler, P., & Kurath, M. (2011). Paradise Lost? “Science” and “the Public” after Asilomar. Science, Technology, & Human Values, 36(2), 213-243. Retrieved from www.jstor.org/stable/41149049

[2] Krimsky, S. (2005). From Asilomar to industrial biotechnology: Risks, reductionism and regulation. Science as Culture, 14(4), 309-323. DOI: 10.1080/09505430500368998

[3] Krimsky, S. (1982). Genetic alchemy. Cambridge, MA: MIT Press.

[4] Davatelis, G. (2000). “The Asilomar Process: Is It Valid?” The Scientist. Accessed online: https://www.the-scientist.com/opinion-old/the-asilomar-process-is-it-valid-56079

[5] Berg, P. (2008). “Meetings that changed the world: Asilomar 1975: DNA modification secured.” Nature, 455, 290-291.

[6] N.A. (2015) Editorial: “After Asilomar”, Nature

[7] Bosley, K. S., Botchan, M., Bredenoord, A. L., Carroll, D., Charo, R. A., Charpentier, E., … & Greely, H. T. (2015). CRISPR germline engineering—the community speaks. Nature biotechnology, 33(5), 478-486.

[8] Later, in the 1980s, issues about whether the integrity of science was at stake would be raised by the commercialization of biotechnology (Jasanoff et al., 2015) and the development of intellectual property and patent systems in universities (Popp Berman 2008, Hackett 2014).

Jasanoff, S., Hurlbut, J. B., & Saha, K. (2015). CRISPR democracy: Gene editing and the need for inclusive deliberation. Issues in Science and Technology, 32(1), 37.

Popp Berman, E. (2008). Why did universities start patenting? Institution-building and the road to the Bayh-Dole Act. Social studies of science, 38(6), 835-871.

Hackett, E. J. (2014). “Academic capitalism.” Science, Technology, & Human Values, 39(5), 635-638.

[9] Genetic Engineering Group of Science for the People (1975) Open Letter to the Asilomar Conference on Hazards of Recombinant DNA

[10] Wivel, N. A. (2014). “Historical perspectives pertaining to the NIH recombinant DNA advisory committee.” Human gene therapy, 25(1), 19-24.

[11] Frederickson, D. S. (1991). Asilomar and Recombinant DNA: The End of the Beginning. In K.E. Hanna (Ed.), Biomedical Politics (pp. 258-298). National Academy Press Washington, DC: National Academy of Sciences.

[12] Vigue, C. L., & Stanziale, W. G. (1979). Recombinant DNA: History of the Controversy. The American biology teacher, 41(8), 480-491.

[13] Wade (1976). “Recombinant DNA: Guidelines Debated at Public Hearing.” Science, 191(4229), 834-836.

[14] N.A. (1977). “Recombinant DNA: Key Documents.” Newsletter on Science, Technology, & Human Values, No. 20, 1-2.

[15] McClean, P. (1997). “The Recombinant DNA Debate.” Accessed online 9/20/2019 at https://www.ndsu.edu/pubweb/~mcclean/plsc431/debate/debate2.htm

[16] Miller, J. D. (1983). Scientific literacy: A conceptual and empirical review. Daedalus, 112(2), 29-48. DOI: 10.2307/20024852

[17] Liu, X. (2009). Beyond science literacy: Science and the public. International Journal of Environmental and Science Education, 4(3), 301-311.


Santiago Molina (he/they) is a sociologist and proud dog dad living in New Orleans. They research how social orders are (re)produced alongside the development of new genetic technologies. They also obsess constantly over their house plants and enjoy a good video game.

Gordon Pherribo (he/him) is a Black Queer microbiologist living in Oakland, CA. He was raised in New Jersey and has a deep fondness for nature and wildlife. His research interests explore both science culture in doctoral training programs and nutrient interactions in microbial communities.