The major food staples are essential to human survival. Chocolate and coffee are not essential, but try to imagine a world without them. One of the numerous concerns with climate change is that many species will lose their habitats. Scientists are projecting that, in the coming decades, this could lead to the extinction of many crops, including cacao and coffee plants.
The cacao tree is native to the Amazon Basin in South America. The Maya and other cultures in South and Central America cultivated the plant more than 1,500 years ago, and today over 90 percent of the world's cocoa is grown on small family farms. The cacao plant's range is a narrow strip of rainforest roughly 20 degrees north and south of the equator, where the temperature, rain, and humidity are relatively constant throughout the year.
Like tropical fruit crops native to Hawaii, cacao has been ravaged by fungal infections. The cocoa industry in Costa Rica never recovered from a fungal outbreak in the 1980s, and the most recent outbreak occurred in Jamaica in 2016. Attempts by scientists to breed new hybrid varieties have failed. Today, most varieties of the cocoa plant are derived from hybrids or clones selected in the 1940s, which means they remain susceptible to the same fungal diseases of the past.
Today, two West African countries, Côte d'Ivoire and Ghana, produce over half of the world's cocoa. By 2050, rising temperatures could push the current growing regions more than 1,000 feet uphill into mountainous terrain, an area now mostly preserved for wildlife, according to the National Oceanic and Atmospheric Administration. Europe and the United States currently have large markets for cacao, while demand in India and China is growing steadily. The reduced humidity caused by rising temperatures will make cacao trees extremely vulnerable and threaten the chocolate industry.
The coffee plant, in contrast, comprises many species. Temperature and rainfall are the main drivers of crop yield. Scientists are projecting longer and more extreme periods of rain and drought, which, along with rising temperatures, could reduce the area suitable for growing coffee by up to 50 percent by 2050. An extensive study found that of the world's 124 wild coffee species, 75 (roughly 60 percent) are at risk of extinction due to climate change.
If scientists can identify which genes of the cacao and coffee plants to modify so that they can adapt to new environmental conditions, they can help these plants survive. Researchers at the University of California, Berkeley, and Mars, the food manufacturer, are collaborating to conserve the cacao plant. In 2008, Mars launched the Cacao Genome Project to identify traits for climate-change adaptability and higher yields. The researchers are using the genome-editing tool CRISPR to edit the DNA of the cacao plant so that it can survive warmer temperatures and drier conditions and grow in different climates.
Genome editing allows breeders to introduce new traits more precisely and rapidly, potentially saving years or even decades in bringing needed new varieties to farmers. The US Department of Agriculture (USDA) does not plan to regulate gene-edited plants or crops if they have traits similar to plants developed through traditional breeding techniques. However, the Food and Drug Administration (FDA) has the final say over the safety of food for human consumption.
Alternatives to coffee as we know it include artificial coffee produced in a lab, which would require government approval, and relocating millions of coffee farmers to new growing regions. These options might make genome editing more appealing to activists.
Rational updating: Lessons from behavioral economics
The conventional wisdom is that farming is an organic practice and that tinkering with genes is risky and unethical. Advances in the biological sciences have allowed scientists to offer a more balanced perspective on how farming practices and genetically modified organisms (GMOs) have affected humans and our planet.
During the Agricultural Revolution, our ancestors transitioned from hunting and gathering to domesticating plants and animals. We now know that clearing land contributed to global warming and that standing water created breeding grounds for mosquitoes, which spread diseases such as malaria. Forests absorb greenhouse gases, while cattle emit methane as they digest their food. Today, gases emitted by farm animals account for 14–18 percent of global greenhouse gas emissions. Domesticating farm animals has also exposed humans to numerous viral infections and poxes, which have produced epidemics.
In 1971, President Richard Nixon launched the War on Cancer to better understand the factors involved in the development of tumors. Some scientists researched mutations caused by chemicals and by radiation from the sun and from bombs, while others focused on viruses and bacteria that invade human cells. The latter discovered the molecular mechanisms that viruses and bacteria use to alter cells, which enabled the field of genetic engineering: scientists can use the same mechanisms to insert DNA carrying desired traits.
Genetic engineering has enabled scientists to develop life-saving drugs, produce food sources that can resist fungal and viral infections and survive harsh conditions, and prevent plant species from becoming extinct. It has also stimulated the economy, creating jobs and hundreds of billions of dollars in economic activity.
Traditional economics assumes that when humans make decisions they rationally weigh the costs and benefits and calculate the best choices for themselves. Behavioral economics, on the other hand, provides valuable insights into why some individuals do not behave in their own best interests. Some people make irrational choices based on errors and biases, while others engage in slow thinking, acquire new information, and update their views rationally.
Biotechnology companies have offered genetically engineered products in American markets since the 1980s. Given that recombinant DNA products have not caused any health or environmental problems during that time, it is unclear why they are not more universally accepted.
Lessons from the Asilomar model
In the absence of state and federal laws, scientists took it upon themselves to develop a plan for how to proceed safely with genetic engineering, or recombinant DNA technology, inside the lab. This led to the 1975 Asilomar Conference, organized by the National Academy of Sciences and held over four days in Pacific Grove, California. One hundred and forty biologists and physicians, four lawyers, and 12 journalists assembled to discuss the potential risks involved with recombinant DNA technology and to establish the conditions under which research should proceed.
While researching the mechanisms of cancer, scientists had discovered that viruses can alter human cells in culture and transform cell lines into a cancerous state. At the time, they were concerned about the potential dangers of such viruses spreading within labs, while activists worried that, if misused, the viruses might escape the labs, harm the environment, and infect humans. The scientists agreed to a voluntary moratorium on certain types of recombinant DNA experiments and to containment measures for others until the risks were better understood.
Only months after the 1975 Asilomar Conference, Senator Edward Kennedy, as chairman of the Subcommittee on Health, held a hearing on genetic engineering and recombinant DNA. Kennedy was initially inclined to extend the moratorium on research to allow more time for input from the concerned public and from activists. The City Council of Cambridge, Massachusetts, did declare a moratorium on research, an act that was followed by similar bans in a number of other cities.
After further research, however, scientists learned that applying a knockout, a technique used to make a harmful gene inoperative, would render the virus harmless. In his 2001 book, A Passion for DNA, James Watson recalls that Kennedy then did an about-face and said, “Following a period without the determination of any real risks, public hysteria cannot be maintained indefinitely in the absence of a credible villain of recombinant DNA technology.” The Asilomar Conference has provided a successful framework for assessing the risks of emerging technologies.
#1 The importance of scientific literacy
In order to provide oversight in the lab, it is necessary to understand the technical aspects of genetic engineering and to distinguish between real and perceived risks. Without that technical understanding, there is a tendency to conflate oversight of the process with oversight of the product. Genetic engineering is a lab procedure used to recombine DNA (a process). Scientists can recombine DNA from two different species in order to produce an animal or plant with a desired trait (a product). Now that scientists have made the lab procedure safe, it is the responsibility of the appropriate federal agencies, including the FDA, the USDA, and the Environmental Protection Agency (EPA), to test and approve the products.
Regulating the risks and safety of automobiles provides an analogy that makes these concepts easier to grasp than the technical details of molecular biology. In the United States, while a Ford vehicle is being built, the Occupational Safety and Health Administration (OSHA) oversees the safety of the factory workers (the process). Once the car moves from the dealership to the highway, the Department of Transportation (DOT) and other government agencies mandate safety measures such as seat belts, air bags, speed limits, and laws against texting while driving (the product), making driving safer and ultimately lowering the fatality rate on American highways.
Human procreation is also a process that recombines DNA. Randomly and through natural selection, it combines genes from the male and female genomes, and the outcome is far less predictable. When DNA is recombined using genetic engineering, by contrast, genes are selected for their functions.
Scientists later suggested that the National Institutes of Health (NIH) form a Recombinant DNA Advisory Committee to establish safety guidelines, set standards for conducting experiments, and oversee NIH-funded projects. In 1976, a committee of experts in the field issued safety guidelines matching the necessary type of containment to different types of experiments. As with the containment facilities used for nuclear weapons research during World War II, levels of risk were categorized as minimal, low, moderate, and high, and scientists were required to follow the appropriate safety standards and procedures at each level.
Given the uncertainty of outcomes in human procreation and the committee's guidelines, one participant at the Asilomar Conference realized that they had just made human procreation a moderate-risk experiment.
#2 Proportionality and risks
Historically, scientists have not fully understood the risks of most important technological innovations at the time of their invention. Today, the use of GM crops in the United States, South America, and Asia is mainstream practice. The United States uses the proactionary principle, according to which risk assessment is based on science and on self-regulation by experts in the field.
The proactionary approach applies proportionality, placing equal emphasis on risks and benefits. Restrictive measures are employed only if the potential impact of an activity is both significantly probable and severe, and the restrictions must be proportionate to the extent of the risk. Under the Asilomar model, manufacturers are held liable for the safety of their products, and regulators must demonstrate that they are not squandering resources and delaying social benefits in pursuit of minimal gains in safety.
Some European countries have applied the precautionary principle to GMOs. Under the precautionary approach, the burden is on the manufacturer to prove that a new product is safe for human health and the environment before it is approved. As a result, farmers have suffered financial losses, and products that could alleviate food shortages and nutritional deficiencies have been delayed or denied to citizens.
In 1982, Genentech developed the world's first genetically engineered drug, synthetic insulin, for patients suffering from Type 1 diabetes. Prior to synthetic insulin, diabetes patients relied on animal insulin derived from pigs, sheep, and cows. Using the precautionary approach would have significantly delayed the medical benefits to many diabetics.
#3 GMOs are held to a higher standard
Societies do not accept the risks of all technologies equally. America's social contract with automobiles is very different from its social contract with GMOs. According to the National Highway Traffic Safety Administration, over the last 20 years Americans have accepted roughly 40,000 annual traffic fatalities in return for a convenience that is ingrained in our lifestyle. That figure fluctuates with regulations such as speed limits, with the safety features built into cars, and with distracting technologies such as texting on smartphones.
To ensure that citizens safely receive the social benefits of biotechnology, oversight would ideally take place through a rigorous clinical-trials process similar to the pharmaceutical industry's. However, drug trials are too time-consuming and cost-prohibitive for the development of most industrial products, and most biotechnology companies would likely not pursue development under such a regime. Even with the time and money devoted to pharmaceutical clinical trials, a 1998 study estimated that roughly 106,000 people die each year in American hospitals as a result of adverse reactions to prescribed medications.
Failed regulatory oversight of the chemical industry has left legacy problems, and that history is a legitimate concern. But driven by irrational fears, perhaps rooted in an inability to differentiate between science fiction and reality, activists hold the biotechnology industry to a higher standard than other technologies.
Adapting to the Anthropocene: Lessons from futures studies
Thousands of years ago, humans relied on wind and water wheels for power, neither of which was reliable or scalable. Prior to the Second Industrial Revolution (1870–1914), human labor and farm animals were the major sources of power. Then electricity and internal combustion engines powered by fossil fuels lifted billions of people out of poverty and contributed to the growth of the middle class, reducing the hours people had to work and the share of their income spent on subsistence.
In the 1950s, automobiles led to the rise of the suburban lifestyle in the US, with gas-guzzling station wagons, modern kitchens, and numerous household appliances. Unfortunately, the Second Industrial Revolution is also remembered for contributing to climate change. The Anthropocene epoch is characterized by the influence of human activities, including land-use change, deforestation, and the burning of fossil fuels, which have accelerated species extinction and global warming.
Scholars have pointed out that, judging by these human activities, the Anthropocene would have begun before the Second Industrial Revolution. Paul Crutzen argues that if it began when carbon dioxide and methane were first produced at rates sufficient to alter the composition of the atmosphere, its start would coincide with James Watt's design of the steam engine in 1784. William Ruddiman suggests that it began even earlier, and that the Agricultural Revolution, which began around 8,000 years ago, is a more accurate starting point.
Professional futurists look to the past to better understand probable scenarios for the future. Using that knowledge, societies should focus on making climate change manageable and on learning how best to adapt to its inevitable effects. Much of the climate-change debate is focused on causality: deforestation, farming, cow flatulence, household appliances, and internal combustion engines are all contributors. But regardless of the politics of climate change, the effects are the same, and planners, business executives, and public policy officials will, one hopes, have a game plan for adapting to the Anthropocene.
Given that climate change is occurring, whatever the cause (natural, man-made, or both), and that genome recombination and editing have enabled scientists to deliver a variety of foods and drugs safely, there is at least one option for preventing numerous species from becoming extinct. Thanks to advances in science and rational thinking, we gourmet chocolate and coffee lovers can continue to feed our addictions. GM opponents, meanwhile, will face a trilemma: labeled GM coffee, the artificial variety, or doing without.