Science-Based Policy Development in the Environment, Food, Health, and Transport Sectors

This one-day symposium explored the interaction between science and policy development in the regulation of the environment, food, health, and transport. It consisted of a series of case studies illustrating the impact of science on policy development. The controversy surrounding the science of global warming, and the resulting focus on reducing carbon dioxide emissions through international agreement and through national and international regulation, is one example of an area where science and policy development are inextricably intertwined. The symposium, cosponsored by CINF, AGFD, ANYL, ENVR, and MEDI, was one of a series seeking to identify other areas where science-based policy development is of increasing importance.

The first speaker in the half-day session was Thomas A. Duster, who spoke about “Adaptive management tools for engineered nanomaterials in municipal wastewater effluents.” Engineered nanomaterials are ubiquitous in consumer products and are routinely delivered to municipal wastewater treatment systems, from which they may subsequently be discharged to the environment. At sufficient concentrations, many common nanomaterials, including titanium dioxide nanoparticles and carbon nanotubes, are toxic or disruptive to aquatic organisms. Applying contemporary environmental policies to mitigate these potential impacts poses significant challenges. For example, the traditional standards-to-permits approach of the Clean Water Act (CWA), which applies to most wastewater treatment plant effluents in the United States, typically involves the development of contaminant-specific water quality criteria. However, research on the detection, fate, and toxicology of nanomaterials is still in its infancy and rapidly changing, limiting the ability of policymakers to justify and establish static effluent discharge standards for these emerging contaminants.

Thomas described an adaptive nanomaterials management approach that strives to bridge the gap between significant scientific uncertainties and an ostensible need for some type of policy structure. At the core of this adaptive management procedure is a robust mechanism for organizing information and data, designed to alert policymakers to convergence in the literature among: (a) observed and/or anticipated concentrations of target nanomaterials in wastewater effluents; (b) demonstrated impacts of these concentrations on aquatic organisms or ecological function; and (c) our technological capacity to reliably detect these target nanomaterial concentrations. The confluence of these factors is expected to be a significant trigger in evaluating the need for specific management actions and/or expansion of policies related to the release of engineered nanomaterials to environmental systems. Finally, Thomas described how specific elements of this approach may be applied to policy challenges for other emerging contaminants.
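To make the convergence trigger concrete, a minimal sketch of how such a check might be encoded follows. The data structure, field names, safety margin, and the nano-TiO2 numbers are all assumptions for illustration; none of them come from the talk.

```python
from dataclasses import dataclass

@dataclass
class NanomaterialEvidence:
    """Literature-derived estimates for one engineered nanomaterial (all in ug/L).
    Field names and values are hypothetical, for illustration only."""
    name: str
    expected_effluent_conc: float   # (a) observed/anticipated concentration in effluent
    lowest_effect_conc: float       # (b) lowest concentration with demonstrated ecological impact
    detection_limit: float          # (c) lowest concentration reliably measurable today

def convergence_trigger(ev: NanomaterialEvidence, margin: float = 10.0) -> bool:
    """Flag a material for management review when the three lines of evidence converge:
    the concentration expected in effluent is measurable with current methods and lies
    within a safety margin of concentrations shown to harm aquatic organisms."""
    measurable = ev.expected_effluent_conc >= ev.detection_limit
    near_effect_level = ev.expected_effluent_conc * margin >= ev.lowest_effect_conc
    return measurable and near_effect_level

# Hypothetical example: nano-TiO2 with invented numbers.
tio2 = NanomaterialEvidence("nano-TiO2", expected_effluent_conc=5.0,
                            lowest_effect_conc=25.0, detection_limit=1.0)
print(convergence_trigger(tio2))  # True: measurable and within 10x of effect levels
```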

Our second speaker was Frederick W. Stoss, who described the “Role of STEM data and information in an environmental decision-making scenario: the case of climate change.” The 1997 Kyoto Protocol to the United Nations Framework Convention on Climate Change (FCCC) established agreements for reducing greenhouse gas (GHG) emissions. Every national academy of science states that anthropogenic GHG emissions impact the Earth’s climate. However, “climate deniers” claim there is no scientific basis for climate change and that it is a well-orchestrated hoax. So contentious were these allegations that computers of the Climatic Research Unit at the University of East Anglia were “hacked,” and email messages and reports became “evidence” of this “scientific hoax.” The results included disruption of FCCC policy negotiations and erosion of public confidence in the science of climate change. In his presentation, Fred investigated the growth of climate information, defined different levels of understanding of and access to information, provided a context by which information is generated, and presented a model demonstrating the role of scientific data and information in environmental decision-making.

The third speaker was Helena Hogberg, who presented work on the “Identification of pathways of toxicity to predict human effects,” coauthored with Thomas Hartung. The 2007 National Research Council report “Toxicity Testing in the 21st Century: A Vision and a Strategy” created an atmosphere for change in the U.S. It suggested moving away from traditional (animal) testing toward modern technologies based on pathways of toxicity, which could be modeled in relatively simple cell tests. Through a transformative research grant, the NIH is funding the Human Toxome project, led by the Center for Alternatives to Animal Testing (CAAT). The project also involves the U.S. EPA ToxCast program, the Hamner Institute, Agilent, and members of the Tox-21c panel. The goal is to develop a public database of pathways, the Human Toxome, to enable scientific collaboration and exchange.

An area of toxicology where Tox-21c could have significant impact is developmental neurotoxicity (DNT). Current animal tests for DNT have several limitations, including high costs ($1.4 million per substance) and substantial time requirements. In addition, there are scientific concerns regarding the relevance of these studies to human health effects. Consequently, only a few substances have been identified as developmental neurotoxicants. This is a concern, as evidence shows that exposure to environmental chemicals contributes to the increasing incidence of neurodevelopmental disorders in children. Moving towards a mechanistic science could help identify the perturbed pathways that are likely to lead to these adverse effects. DNTox-21c is a CAAT project, funded by the FDA, that aims to identify pathways of developmental neurotoxicity using a metabolomics approach.

Besides the technical development of new approaches, a case was made that we need both conceptual steering and an objective assessment of current practices through evidence-based toxicology. The speakers suggested applying an approach modeled on Evidence-based Medicine (EBM), which over the last two decades has demonstrated that rigorous systematic reviews of current practices provide health care professionals and patients with the best current scientific evidence for diagnostic and treatment options.

The first speaker after the intermission was Rodger Curren, who addressed the “Role of education and training in supporting science-based policy development,” a presentation co-authored with Hans Raabe and Brian Jones. Policy changes, especially in the regulatory requirements for the safety of new products, are often impeded because decision makers in national regulatory bodies are unaware of the science supporting new methodologies. This is not entirely unexpected, since such individuals may be exposed more to political concerns than to scientific ones on a daily basis. A current example is the area of non-animal methods for toxicity testing, where significant international differences in acceptance exist. Europe and the U.S., for example, are quickly moving to human-derived cells and tissues rather than whole-animal models. Other countries, such as China, may be reluctant to make a change because their scientists have not had sufficient time to develop sound databases of information. The authors have found that providing specific hands-on training and education on standard methods directly to regulators and scientists in these countries has significantly improved the recognition and acceptance of new approaches.

The next speaker was Julie Jones, who highlighted “Policy divergence in the absence of science: The case of e-cigarettes,” a presentation co-authored by David Lawson. Over the past five years, electronic cigarettes (e-cigarettes) have emerged as a new consumer product used by an increasing number of smokers seeking less risky alternatives to conventional cigarettes. E-cigarettes tend to be designed to look and feel similar to conventional cigarettes, but they do not contain tobacco; they are battery-powered devices that produce an aerosol usually containing nicotine. Currently, there is significant inconsistency in the way e-cigarettes are regulated: they are banned in some countries and regulated as medicinal, tobacco, or general consumer products in others. There is also a diversity of views regarding the potential role e-cigarettes could play in reducing the public health impacts of tobacco use. The science to support this emerging category of products is still under development, however, and there are many gaps. E-cigarettes thus represent a timely case study of policy development for the regulation of a new product category in the absence of a solid scientific foundation. Julie presented her views on how the development of such a scientific foundation might be accelerated to help inform an appropriate regulatory framework for e-cigarettes.

In a related paper, Christopher J. Proctor discussed the “Role of regulatory science in reducing the public health impact of tobacco use,” co-authored by Chuan Liu. Through the 2009 Family Smoking Prevention and Tobacco Control Act, the U.S. FDA is introducing a variety of regulations aimed at reducing the public health impact of tobacco use, including regulations addressing the levels of harmful and potentially harmful constituents of tobacco products and regulations governing modified-risk tobacco products. The FDA has set out a series of research questions that it believes are needed to underpin its regulatory proposals and, in association with the NIH, has initiated a large research funding program. Other scientific advisory groups, including the World Health Organization’s Scientific Advisory Committee on Tobacco Product Regulation, have also listed research needed to assist the development of science-based public policy on tobacco. Christopher summarized the research questions being framed by regulators as they relate to product regulation and provided some views on how the development of regulatory science in tobacco might be accelerated.

The final speaker, David Richardson, described “Systematic and structural risk analysis approaches for establishing maximum levels of essential nutrients and other bioactive substances in fortified foods and food supplements.” Nutritional risk analysis addresses essential nutrients and other substances with nutritional and physiological effects, and the risk to health from their inadequate and/or excessive intake. David reviewed the principles of risk management that underpin regulatory developments around the world to establish maximum amounts of vitamins, minerals, and other substances in fortified foods and food supplements. Proposed science-based risk management models for public health decision-making take into account international risk assessments and (1) the tolerable upper intake levels (ULs) for vitamins and minerals, (2) the highest observed intakes (HOIs) for bioactive substances for which no adverse effects have been identified, and (3) the contributions to total intake from conventional foods, fortified foods, and food supplements. These models propose the allocation of nutrient substances into three categories of risk, with corresponding maximum levels, in order to protect consumers, both adults and children, from excessive intakes.
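The allocation arithmetic behind such models can be illustrated with a short sketch. This is a simplified illustration, not the model David presented: the function and the intake figure are assumptions (the UL of 2,000 mg/day for vitamin C is the U.S. Institute of Medicine value, but the 95th-percentile dietary intake shown is invented for the example).

```python
def max_fortification_level(ul_mg_per_day: float,
                            food_intake_p95_mg: float,
                            other_sources_mg: float = 0.0) -> float:
    """Simplified allocation: the daily amount available for fortified foods
    and supplements is the tolerable upper intake level (UL) minus high-end
    intake from conventional foods and any other sources. Illustrative only;
    not the risk management model presented in the talk."""
    remaining = ul_mg_per_day - food_intake_p95_mg - other_sources_mg
    return max(remaining, 0.0)  # never allocate a negative amount

# Hypothetical example: vitamin C, UL = 2000 mg/day (adults), with an assumed
# 95th-percentile intake of 250 mg/day from conventional foods.
print(max_fortification_level(2000, 250))  # 1750.0 mg/day left to allocate
```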

William Town, Symposium Organizer