SETAC Globe - Environmental Quality Through Science
SETAC Barcelona Session Summaries
Special Issue


Special Sessions from SETAC Barcelona


  • Commuter Air Quality in Rail Subway Systems: Current Understanding and Future Mitigation
    • Teresa Moreno (IDÆA – CSIC)

      Millions of people worldwide commute using subway systems and are routinely exposed to levels of contaminated air that would be illegal above ground. In Europe alone, more than 60 cities use rail subways to facilitate commuter movement. With average return journeys lasting around one hour, underground commuters inhale particulate matter at concentrations that can exceed the 50 μg/m³ mean PM10 (particulate matter <10 microns in size) limit legally imposed for outdoor European city air. In addition, inhalable particles on platforms are very different from those in the outdoor environment. Despite this drawback, underground rail in general can be considered “environmentally clean,” facilitating commuter travel and reducing air pollution in the city above ground. Although there are currently no official regulations or recommendations with regard to air quality underground or indoors, several working groups are emphasizing the importance of legislating for indoor air quality. Given this context, this special session aimed to be part of a high-profile attempt to guide legislative frameworks towards more effective control of indoor air quality.

      The session was chaired by Teresa Moreno (IDÆA, CSIC, Barcelona, Spain) and involved a far-ranging discussion on the key issues relevant to subway air quality within the frame of the IMPROVE LIFE project, which aims to assess air quality in subway facilities and propose measures to achieve cleaner public transportation, thus benefitting both users and employees. The session consisted of five platform presentations and five posters. First, Lidia Morawska (Queensland University of Technology, Brisbane, Australia), an expert in indoor air, outlined her work on infection spread in public transport, highlighting the importance of the proximity of passengers and the duration of each trip. Morawska sees a gulf in thinking between clinicians, engineers and scientists on this subject, and she explained that ventilation is not the only mechanism to control infection spread in any transport system, including our cars. Caroline Duchaine (Université Laval, Quebec, Canada) emphasized the need to develop surveillance methods for disease transmission in public vehicles, summarizing the results of a joint Laval–CSIC study on bioaerosols found in the Barcelona Metro. Bioaerosols in this unique environment are relatively poorly characterized and, unlike in most outdoor situations, have humans as their main source. Moreno outlined the objectives of the IMPROVE LIFE project and stressed the importance of the main variables affecting air quality in the subway: the air quality of a given subway platform involves a complex interplay of the ventilation system, station depth and design, train speed and frequency, wheel materials and braking mechanisms, and the number of passengers being transported. Frank Kelly (King’s College London) gave an overview of progress on an ongoing study of air quality in the London Underground system, the oldest in the world.
In this system, air quality largely depends on how many kilometers of each line run underground (45% of the London Underground is below ground), with particulate matter levels being higher at deeper stations and dropping to ambient outdoor levels within five minutes of a train traveling above ground. Finally, Alberto Giretti (Marche Polytechnic University, Ancona, Italy) used his engineering expertise to demonstrate the need for intelligent control of subway ventilation systems using sensor networks. Giretti showed how simulation results can be used to estimate pollutant exposure levels for passengers, and how the dynamics of pollutants in a given station depend on both external (meteorological conditions) and internal (piston effect, passenger flow) factors.
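
Giretti’s point about pollutant dynamics can be illustrated with a minimal single-box mass-balance model of a station platform, the kind of simplification such simulations build on. All parameter values below are invented for illustration and are not figures from the session.

```python
# Hypothetical single-box model of platform PM10 concentration.
# Mass balance: dC/dt = S/V + (q/V) * (C_out - C), where S is the
# wear-particle source during braking and q the air exchange flow.
# Every number here is an illustrative assumption.

def simulate_platform_pm(hours=2.0, dt_s=1.0,
                         volume_m3=10_000.0,           # station air volume
                         vent_m3_s=5.0,                # baseline exchange flow
                         outdoor_ug_m3=20.0,           # ambient PM10 outside
                         train_emission_ug_s=50_000.0, # brake/wheel wear burst
                         headway_s=300.0,              # one train every 5 min
                         braking_s=15.0):              # burst duration
    """Return a list of (time_s, concentration_ug_m3) samples."""
    c = outdoor_ug_m3  # start at ambient level
    out = []
    steps = int(hours * 3600 / dt_s)
    for i in range(steps):
        t = i * dt_s
        # Source term: emission burst while a train brakes into the station
        emitting = (t % headway_s) < braking_s
        source = train_emission_ug_s if emitting else 0.0
        # Piston effect: an arriving train pushes extra air through the tunnels
        q = vent_m3_s * (3.0 if emitting else 1.0)
        # Explicit Euler step of the mass balance
        c += dt_s * (source / volume_m3 + (q / volume_m3) * (outdoor_ug_m3 - c))
        out.append((t, c))
    return out
```

In this toy model, concentrations spike during each braking burst and relax towards ambient levels between trains; an intelligent ventilation controller of the kind Giretti described would, in essence, modulate the exchange flow in response to sensor readings of exactly this dynamic.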

      The subsequent general discussion, led by the chairwoman and involving both panel members and the audience, ranged across a number of subjects with the following conclusions:

      1. There is a need to compare the transient doses received when using different types of transport and to balance any negative health effects against positive effects, such as the well-documented cardiovascular benefits of exercise when walking or cycling
      2. The health significance of short, high exposures remains an open question, as does the question of what can be done to reduce exposure
      3. There are technical solutions for underground air quality, involving the adoption of new or improved systems of maintenance and energy use, such as regenerative braking and the use of intelligent sensors to allow monitoring and control in real time
      4. Any suggested improvements must be cost effective and technically feasible; for example, can brake manufacturers be encouraged to change the metal content of their pads to produce more “ecologically sensitive” compositions and PM emissions?
      5. The IMPROVE LIFE project needs to identify best practices to improve air quality underground and share this information with other subway operators around the world
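
The first conclusion, comparing transient doses across transport modes, can be sketched with a simple inhaled-dose calculation. All concentrations, inhalation rates and trip durations below are assumed, illustrative values, not measurements reported in the session.

```python
# Illustrative comparison of inhaled PM10 dose per one-way trip for different
# commute modes. Every number is an invented, assumed value.

def inhaled_dose_ug(conc_ug_m3, inhalation_m3_h, duration_min):
    """Inhaled dose (µg) = concentration × inhalation rate × time."""
    return conc_ug_m3 * inhalation_m3_h * (duration_min / 60.0)

modes = {
    # mode: (PM10 µg/m³, inhalation m³/h, trip minutes) -- assumed values
    "subway":  (80.0, 0.6, 30),  # higher PM, seated, resting breathing
    "walking": (25.0, 1.5, 60),  # ambient PM, elevated breathing, longer trip
    "cycling": (30.0, 2.5, 25),  # near-road PM, heavy breathing, faster trip
}

doses = {mode: inhaled_dose_ug(*vals) for mode, vals in modes.items()}
```

With these invented numbers, the walker receives the largest dose despite breathing the cleanest air, because of the longer trip and elevated breathing rate, which is precisely why such dose comparisons must be weighed against the health benefits of the exercise itself.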


  • Engineering in vivo Models for Advancing Environmental Hazard and Risk Assessment
    • Charles Tyler, Ross Brown, Tetsu Kudoh and Jon Green (University of Exeter); François Brion (INERIS); Andrew Tindall and Petra Spirhanzlova (Watchfrog); Daniel Gorelick (University of Alabama); and Helmut Segner (University of Bern)

      There is increasing recognition internationally of the need to modernize and better focus the environmental hazard and risk assessment of chemicals, including via the development and application of mechanistic and predictive tools and systems [1, 2, 4–6]. Promising examples of integrative systems for better understanding chemical targets and effect mechanisms include genetically engineered (transgenic) in vivo models [3]. In these models, “reporter” genes, linked to fluorescent marker proteins, are capable of signaling the activation of specific receptors or enzymes by chemicals, and they can provide rapid, visual, high-content screens of molecular initiating events (MIEs) and downstream physiological, pathological and functional effects.
      The aim of this session was to discuss the advantages and challenges concerning the application of transgenic in vivo models, principally fish and amphibian models, in chemical hazard and risk assessment.

      Detail of the Session Talks
      After a brief introduction describing the basics of what it means to be transgenic, Charles Tyler (University of Exeter) presented the state of the science and the range of transgenic models currently available, and Helmut Segner (University of Bern) discussed the attributes of a “model” organism in the context of biomedical and environmental research, (eco)toxicology and chemical hazard and risk assessment. A series of specific case studies on zebrafish (Danio rerio), medaka (Oryzias latipes) and African clawed frog (Xenopus laevis) were then presented, illustrating the potential for transgenic in vivo models to assess chemical effects initiated via a range of receptor, enzyme and other protein targets, across multiple tissues and organs, quantitatively and in real time (thus enabling spatial and temporal profiling). Chemical target-specific case studies included:

      1. Androgen-responsive spiggin-GFP (green fluorescent protein) medaka by Andrew Tindall (Watchfrog)
      2. Estrogen-responsive ERE-GFP zebrafish by Tetsu Kudoh and Jon Green (University of Exeter)
      3. Estrogen-responsive cyp19a1b-GFP zebrafish by François Brion (INERIS)
      4. Glucocorticoid and aryl hydrocarbon receptor-GFP zebrafish by Daniel Gorelick (University of Alabama)
      5. Thyroid hormone TH/bZIP-GFP Xenopus tadpoles by Petra Spirhanzlova (Watchfrog)

      Some of the above models have been further developed for high-content, semi-automated, medium-throughput chemical hazard screening (presented by Green), and some have been shown to respond at environmental exposure levels, with direct relevance to environmental risk assessment and to chemical release or effluent discharge monitoring (Kudoh and Gorelick). Practical issues associated with the validation of transgenic in vivo models via OECD ring testing were also discussed by Brion. In parallel with their application in chemical testing and adverse outcome pathway analysis, as presented by Ross Brown (University of Exeter), these models are expected to aid fundamental research into homeostasis, steroidogenesis and the functional roles of natural hormones and chemical mimics in physiological processes, metamorphosis, and sexual differentiation and development in vertebrates.

      Major Conclusions of the Session
      The session promoted discussion from academic and industrial researchers and regulators concerning the development and engineering of transgenic animals, principally fish, for use in ecotoxicity testing of chemicals, alongside traditional tools, primarily in the context of environmental hazard and risk assessment.

      Transgenic in vivo models offer an elegant solution for high-content and medium- to high-throughput screening of chemical effect mechanisms via the visualization and quantification of fluorescent reporters linked to specific chemical targets (receptors, enzymes, proteins) in specific organs and tissues in fish and amphibian models.

      The speed and sensitivity of assays depend on the level of automation, duration of fluorescence measurement, and level of discrimination (spatial and temporal) of image analysis.  A key factor affecting in vivo model sensitivity was whether the line was homozygous (some of the reported transgenic lines were heterozygous, and this resulted in less consistent responses). The majority of existing models appear stable, and sensitivity is sufficient in some of these to detect chemical effects at environmentally relevant exposures.

      The transgenic fish models currently available have high specificities for key chemical receptor targets, including androgen, estrogen, glucocorticoid, aryl hydrocarbon and thyroid hormone receptors, and for enzyme targets, including the cytochrome P450 enzymes involved in steroid hormone synthesis and xenobiotic metabolism. The specificity of transgenic models may be affected to some degree by the promiscuity of some receptor targets and by the fact that chemical compounds often have multiple modes of action. A good example of this was illustrated by Gorelick for the interactions of various steroids (androgens, thyroid hormones, progestagens and glucocorticoids) with the glucocorticoid receptor. These issues can be overcome by generating mutant transgenic lines in which certain receptors are inactivated, or in which multiple transgenes with distinct fluorescent codes report different receptor- or enzyme-mediated responses.
      Transgenic model development in separate research institutes has shown that a number of alternative transgene constructs can be used successfully to differentiate chemicals based on effect mechanisms and potencies, and also delineate their tissue or organ targets in the body. Agreement between alternative models and between laboratories using the same models in OECD ring testing provides confidence in the validity and reliability (repeatability) of transgenic models. 

      Fluorescence responses in transgenic models can be used to quantify molecular initiating events (e.g., receptor activation) and downstream physiological responses (e.g., spiggin protein production).  However, effort is needed to calibrate these quantitative response biomarkers with adverse effects on individuals and populations (e.g., via adverse outcome pathways).
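
As a hedged illustration of such quantification, the sketch below fits a Hill curve to synthetic fluorescence readings from a hypothetical reporter line in order to estimate an EC50; none of the data or parameter values come from the models presented in the session.

```python
# Hypothetical calibration sketch: estimating an EC50 from fluorescence
# dose-response data of a transgenic reporter line. Data are synthetic.

def hill(conc, ec50, top=100.0, n=1.0):
    """Predicted fluorescence (% of maximum) at a given concentration."""
    return top * conc**n / (ec50**n + conc**n)

def fit_ec50(concs, responses, n=1.0):
    """Grid-search least-squares EC50 estimate (pure-stdlib sketch)."""
    best_ec50, best_sse = None, float("inf")
    for i in range(-300, 301):          # log10(EC50) from -3.00 to 3.00
        ec50 = 10.0 ** (i / 100.0)
        sse = sum((hill(c, ec50, n=n) - r) ** 2
                  for c, r in zip(concs, responses))
        if sse < best_sse:
            best_ec50, best_sse = ec50, sse
    return best_ec50

# Synthetic dose-response data generated from a "true" EC50 of 10 (arbitrary
# concentration units); in practice these would be image-derived readings.
concs = [0.1, 1.0, 3.0, 10.0, 30.0, 100.0]
responses = [hill(c, ec50=10.0) for c in concs]
estimate = fit_ec50(concs, responses)
```

Anchoring such fitted response parameters to apical outcomes is the calibration step the session identified as still outstanding.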

      Among the challenges highlighted was the potential susceptibility to methylation (silencing) of the signal amplification sequences crucial to the fluorescence induction system. This technical problem, however, is likely to be temporary, as newly available sequences that are not subject to such methylation can be substituted into some of the established transgenic lines.

      The session illustrated that more systematic approaches, which carefully consider not only the target sequence of interest (e.g., the receptor binding site) but also the number of target sequences incorporated, their orientation, the spacing between the target sequences, and the types of minimal promoters used, are likely to yield more positive and consistent outcomes in creating transgenic lines.

      The sharing of resources in the form of established transgenic models and techniques for developing further models will be a major output from this special session, and this is seen as a major potential benefit to chemical hazard and risk assessment as well as biomedical and environmental research.

      Please contact us if you would like to know more about the availability and potential utility of transgenic in vivo fish or amphibian models.


      1. Bradbury SP, Feijtel TC, van Leeuwen CJ. Meeting the scientific needs of ecological risk assessment in a regulatory context. Environ Sci Technol. 2004 Dec;38(23):463A–470A.
      2. EC (European Commission). SCHER, SCENIHR, SCCS Opinion on: Addressing the New Challenges for Risk Assessment. Brussels (Belgium): European Commission; 2013. 154 pages.
      3. Lee O, Green JM, Tyler CR. Transgenic fish systems and their application in ecotoxicology. Crit Rev Toxicol. 2015 Feb;45(2):124–141.
      4. National Research Council (NRC). Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: The National Academies Press; 2007. 196 pages.
      5. National Research Council (NRC). Science and Decisions: Advancing Risk Assessment. Washington, DC: The National Academies Press; 2009.
      6. National Research Council (NRC). Exposure Science in the 21st Century: A Vision and a Strategy. Washington, DC: The National Academies Press; 2012.
  • Epigenetic Effects of Chemicals in Ecological and Human Risk Assessment: Challenges and Perspectives
    • Michael Eckerstorfer and Helmut Gaugitsch (Environment Agency Austria), Benjamin Piña (IDAEA-CSIC, Spanish National Research Council) and Jean-Lou Dorne (European Food Safety Authority)

      The assessment of epigenetic effects resulting from exposure to environmental stressors, including chemicals, is an emerging topic for human and ecological risk assessment in Europe and at the international level. The session featured input from experts in academic and regulatory science to address this emerging field. The presentations and discussions highlighted the relevance of epigenetic mechanisms as well as the associated toxicological effects. Epigenetic mechanisms are central to a number of biological processes relevant for risk assessment in the human health area (carcinogenesis, developmental and reproductive toxicity, etc.) as well as the environmental area (reproductive toxicity, trans-generational effects, resistance to toxicants, etc.).

      The keynote lecture by James Trosko (Michigan State University) addressed key concepts, including the idea that epigenetic effectors do not act by introducing genetic mutations and usually do not cause direct cytotoxic effects. Such effectors can act in a reversible manner and may lead to adverse effects only during specific developmental phases or under specific exposure patterns. Epigenetic pathways regulate endogenous biological processes such as cell differentiation and development, and chemicals may interfere with these endogenous functions. The challenge is to distinguish adverse effects initiated by environmental stressors from biological and physiological variability in epigenetic pathways. As pointed out by Marcel Leist (University of Konstanz), key challenges remain with regard to the empirical quantification of effects, the assessment of combined effects of multiple environmental stressors (which may disrupt epigenetic homeostasis), and the assessment of specific exposure patterns relative to temporal windows of sensitivity.

      Dave Spurgeon (United Kingdom Centre for Ecology & Hydrology) showed that some of the molecular mechanisms of epigenetic regulation are also present in a number of invertebrates, such as snails and earthworms. These mechanisms are involved in adaptive processes that allow earthworms to cope with environmental stressors (e.g., toxicants like arsenic and other metals). Such epigenetic mechanisms may be triggered in earthworm populations in a lineage-specific way as an alternative to genetic mechanisms of adaptation. Hence, epigenetic mechanisms are involved in invertebrate responses, and potentially adverse outcomes, to environmental toxicants and pollutants in a very complex manner.

      Current EU regulatory frameworks mandate the analysis of all available scientific information for substances such as pesticides, which implies that evidence of epigenetic effects needs to be assessed under the present frameworks. Current testing methodologies (e.g., for reproductive toxicity) partly address adverse effects mediated by epigenetic pathways of regulation; however, the modes of action or adverse outcome pathways involved remain to be elucidated for a variety of relevant effects.

      Hannele Huuskonen (European Chemicals Agency) and Jean-Lou Dorne (European Food Safety Authority) concluded that the assessment of relevant epigenetic effects still needs to be properly integrated into the current risk assessment frameworks. The two presentations and the following discussion indicated that a number of methodologies and experimental designs are needed to achieve such an integration. For example, human stem cells seem to be a key tool for the analysis of epigenetic effects involved in cell differentiation, whereas some of the molecular mechanisms implicated in epigenetic regulation can also be assessed in whole non-vertebrate metazoans, particularly annelids and mollusks.

      In order to refine current risk assessment approaches to account for relevant epigenetic effects, risk assessors and the scientific community need to address a number of challenges, some of which were identified in this session:

      1. The need to develop appropriate model systems for the characterization of epigenetic effects (e.g., stem cell models)
      2. The development of robust methodologies to quantify relevant epigenetic effects
      3. The use of current knowledge on epigenetic effects for the development of adverse outcome pathways and mode of action-based frameworks which can be applied in ecological and human health risk assessment
      4. The investigation of non-canonical ecotoxicological and toxicological effects, which may be caused by epigenetic effects on (endocrine) signaling pathways in organisms
      5. The assessment of the toxicokinetic profile of epigenetic effectors and their impact on metabolism to allow for the sound determination of internal exposures
      6. The identification of “signature characteristics” of substances with regard to adverse epigenetic effects, for screening or prioritization for further assessment
      7. Efforts to characterize modes of action or adverse outcome pathways, which include epigenetic mechanisms as key events, to support weight of evidence-approaches in risk assessment.

      While the session highlighted the ongoing progress to address aspects of epigenetics relevant for risk assessment, further discussions involving both the research community and risk assessors are required. Finally, the conclusions raised during this special session further support the need for a follow-up debate facilitated by SETAC and the relevant authorities at national and EU-level, including the European Chemicals Agency and European Food Safety Authority.


  • How Can “-omics” Help REACH?
    • Bruno Campos (Institute of Environmental Assessment and Water Research), Stuart Marshall (Unilever), Wim DeCoen (European Chemicals Agency), Mark R. Viant (University of Birmingham), Carlos Barata (Institute of Environmental Assessment and Water Research), Ben Brown (Lawrence Berkeley National Laboratory) and John Colbourne (University of Birmingham)
      A scientific discussion entitled “How ‘-omics’ Can Help REACH” took place at the SETAC Europe annual meeting in Barcelona on 5 May 2015. This special session assembled academic scientists, risk assessors from industry, regulators and policy makers to explore the application of multi-omics technologies and assays as a decision-making tool in regulatory science. We aimed to find a solution to the chemical control dilemma: improving environmental health protection while supporting the needs of industry for cost-effective innovation. Further, we sought input from policy makers on how to apply new biotechnologies to implement existing EU regulatory frameworks such as the REACH Regulation, the Water Framework Directive (WFD) and other legislative frameworks for chemical regulation around the world. The session proved highly popular, with more than 500 attendees present.
      REACH requires safety assessment and risk management for all chemicals used in Europe in quantities of more than one tonne per year; the WFD requires assessment and management of ecological and chemical water quality for inland, transitional and coastal water bodies in Europe. These are critical and complex pieces of legislation intended to ensure public safety and a healthier environment. Yet there are many scientific constraints on our ability to measure risks to human health or the environment. In particular, standard procedures for toxicity testing can be expensive and, for human risk assessments, require the extensive use of mammalian models. To support these EU regulatory frameworks, we require novel, innovative and coordinated scientific approaches that are rapid, cost effective and consistent with European ethical standards for the treatment of animals. A proposed solution, and the focus of our discussion, is to apply high-throughput toxicity testing with data-rich “-omics” approaches (e.g., genomics, transcriptomics, proteomics, metabolomics) in non-mammalian model organisms plus in vitro models. Key questions that arose included the reliability and expense of these technologies, and the extent to which “-omics” measurements can be used to discover modes of action (MoA) and adverse outcome pathways for humans and ecosystems. Furthermore, to answer these questions, a shared understanding of “risk” must be generated among academics, industry and regulators.

      This special session was arranged to build upon the plenary presentation by John Colbourne, University of Birmingham, “Towards a Big Data-driven Solution for Cooperative and Effective Management of Chemical Risks,” who encouraged regulatory scientists to be inspired by the scientific enterprises of other disciplines; to work together, think big, and to place scientific ambitions and problem solving ahead of budgetary concerns. The session included four presentations given by Mark Viant (University of Birmingham), representing academia, Stuart Marshall (Unilever), representing industry, Wim De Coen (ECHA), representing regulators, and Georg Streck (European Commission, DG GROW), representing policy makers.

      Viant presented a talk entitled “Exploring the Potential Roles of Metabolomics in Chemical Hazard Assessment,” which included a series of examples that illustrated two of the strengths of “-omics” research, namely how metabolomic profiles can predict apical endpoints of regulatory concern and how they can provide a mechanistic understanding of adverse outcomes. Viant emphasized that we are finally empowered with sufficient know-how to scale up knowledge gathering from typically small-scale “-omics” studies in academia towards industrializing high data quality production, and to manage data processing and storage on a European scale. He made clear that collaboration and coordination with stakeholders in both industry and government will be essential to enhance the usefulness of scientific knowledge and identify regulatory applications for available technologies.

      Marshall then gave his insider perspective on risk assessment and management from the industry viewpoint. Unilever, a consumer goods producer, is progressing the science of toxicity pathways to put in place the tools and novel thinking needed to implement Toxicity Testing in the 21st Century/Adverse Outcome Pathways (TT21C/AOP)-based risk assessments aimed at removing dependence on apical endpoint toxicity studies. Accordingly, research within this industry sector has focused on exploring pathway-based approaches toward mechanistic models of the effects of new chemicals on human health, leveraging recent advances in bioinformatics, in vitro assays, mathematical modeling, high-content “-omics” technologies and systems biology. Examples of pathway-based approaches include screening with sensors, mapping the activities of transcription factors and transducers for major stress response pathways, and the use of sentinel genes or gene sequencing and bioinformatics for the identification of off-target effects. Marshall outlined how these diverse data types could be used in a Weight of Evidence (WoE) framework, enabling chemical read-across and the extrapolation of information across taxa. In his view, this approach enables better environmental and human health protection while facilitating new product development and stewardship.

      The third talk was given by Wim De Coen, a former academic scientist with a strong “-omics” background who now works for the European Chemicals Agency (ECHA). De Coen explained how ECHA is bound to the legal requirements of REACH. He sees promise in the prospect of receiving “-omics” data as part of chemical safety evaluations: REACH allows and encourages the use of alternative testing procedures as long as they are “on par” with standard methods and do not lead to the underestimation of risks. He raised the point that, despite their great potential, no “-omics” data have yet been submitted to ECHA in any chemical evaluation dossier. He noted that he would have retitled the session “How Can REACH Save ‘-omics’?” so as to better focus on the utility of “-omics” for regulation. He posed several questions to the scientific community:

      1. How predictive are the “-omics”-based test data of higher levels of biological organization, including tissues, organisms and populations?
      2. How and to what extent can we relate “-omics” to adverse effects?
      3. What is the ecological relevance of “-omics”?

      He summarized that there is potential in ECHA assessments to review “-omics” data, particularly in support of substance prioritization and arguments based on chemical read-across, but credible use of these technologies will require massive efforts towards validation and standardization. “-Omics” can and should be part of REACH, but the scientific community needs to provide better guidelines for regulators and registrants on interpreting the information. It is also important to consider the costs of “-omics”-based technologies compared with standard tests. There have been several success stories in which “-omics” technologies have been used to elucidate mechanisms of action, but more work is needed to link molecular events to organismal and ecological adverse outcomes. Stakeholders will be convinced by facts, not by potential.

      Finally, Georg Streck from the European Commission, DG GROW, presented a talk highlighting that the use of “-omics” data should not focus exclusively on REACH but include other regulatory frameworks. However, a harmonized approach for introducing such methods would be required. An important message was that there are opportunities for “-omics” under REACH as a supportive tool: in a WoE approach; in screening and priority setting; for grouping, category building and read-across; in toxicity assessments providing information on the MoA; and for supporting the selection or deselection of alternatives to hazardous substances. Streck emphasized that the scientific community should bring knowledge and research results to regulators and stakeholders, and we should all strive for internationally accepted assessment schemes and harmonized methods (e.g., via OECD) and possibly standardization. He stressed the importance of information on limitations and uncertainties linked to “-omics” methods, and of a comprehensive documentation of studies performed.

      Following the four presentations, there was a round table discussion among the chairs, presenters and session attendees. Perhaps surprisingly, the principal conclusion reached by all four sectors (academia, industry, regulators and legislators) was that they envisioned an important role for the “-omics” technologies to improve decision-making in risk assessment and that the path can be relatively straightforward, as long as communication among the four represented groups improves towards greater coordination and transparency. In particular, the scientific community needs to increase its effort to understand the needs of regulators and legislators, while the regulators and legislators would benefit from robust and validated case studies.

      Activities that are now under consideration include:

      1. The formation of an advisory group representing industry, academia, regulators and legislators within SETAC to enhance community organization and to serve as an open forum for sharing ideas and perspectives
      2. A Memorandum of Understanding (a “Barcelona Road Map” describing the future use of “-omics” in regulatory decision-making) will be drafted by the participants in this special session and distributed to stakeholders for input. This roadmap is an attempt to explicitly define the scientific product (a “Knowledge Base”) as a publicly owned, indispensable tool for predictive toxicology and effective risk management. The roadmap has the important additional function of identifying the direction and the means of achieving our shared desired outcome.
      3. A SETAC workshop will be planned to develop coordinated case studies that provide proofs of concept for the areas of application described in the roadmap; experimental designs for using “-omics” in regulatory testing procedures will be proposed and debated, and common substances will be identified for coordinated assessment and validation activities.

      These three actions are proposed to improve coordination and collaboration between stakeholders, enabling more rapid adoption of alternative and supplementary testing procedures based on “-omics” technologies.

      At the end of the session, the community was asked to participate in this effort and more than 50 people from industry, academia and government showed interest in working together toward improved health and environmental protection through “-omics” science.


  • Sustainability of Mediterranean Olive Oil Production
    • Sabine E Apitz (SEA Environmental Decisions, Ltd.)

      The value of sustainability, ecosystem services and footprint concepts has been demonstrated in many applications, but practitioners of these fields do not always collaborate as closely as they should. The Ecosystem Services and Sustainability Advisory Groups have co-organized sessions on the Sustainability of Whisky and Chocolate, and now Mediterranean Olive Oil Production. European olive oil represents about 80% of worldwide production; in 2007 there were 1.9 million farms with olive groves in the EU, concentrated in Spain, Italy and Greece. On small farms, olive oil production may be a secondary, traditional and family activity, but high incomes can be linked to larger-scale olive groves. The sector, with a long tradition as part of Mediterranean culture, religion, cuisine and landscape, is being modernized in some areas by intensifying production and at times by extending production into new areas. These changes have a range of impacts on the sustainability of olive oil production and of the landscapes within which it is produced.

      To provide context for discussions, this session began with two platform presentations; the first provided an overview from a primarily ecosystem services (ES) perspective, and the second introduced the application of life cycle assessment (LCA) concepts to olive oil production. Guided discussions then addressed how these two perspectives could be integrated in support of more sustainable production. Discussions were lively, with a significant portion of the audience taking part. There was broad agreement that, along with the provisioning services on which assessments typically focus, traditional olive oil production provides key cultural and supporting ES. Though these can be degraded during intensification, they are not well addressed by traditional LCA approaches, which focus only on negatives (emissions, consumption) without effectively addressing positives such as landscape, biodiversity and cultural heritage. Common Agricultural Policy (CAP) subsidies focused on production have made these issues worse. Unless carefully designed, LCA will always give the highest ratings to the most productive orchards, failing to value the “externalities” of traditional olive growing; ES-focused approaches, meanwhile, are sometimes not quantitative enough. There is a need either to develop hybrid LCA tools or to complement them with broader ES-focused analyses that identify and communicate trade-offs. One suggestion was to consider different functional units in LCA, such as price as a proxy for quality (and thus for a more traditional and less processed product) rather than liters of oil, which in turn might help remove the bias towards highly intensive approaches. Recognizing that footprint assessments are oversimplifications, it was suggested that we need to develop and integrate tools that also consider the positives, to evaluate whether landscapes are truly meeting society’s desires and needs (“handprinting”).
      Resisting the temptation to be too reductionist, it is necessary to recognize that these questions are post-normal, that there are no right answers (though there may be wrong ones), and that using science to help us envision what we need from landscapes is a collaborative process well outside the comfort zone of many.

      To sustain ecosystems and economies, olive oil production and policy should:

      1. Encourage the good (traditional olive ecosystems enhance biodiversity, ensure genetic diversity and resilience, and are highly valued by society, but they are less competitive and require economic support)
      2. Improve the bad (ecological infrastructure can improve biodiversity, management can control erosion, sustainable management controls wastes, inputs and outputs, even in more intensive systems)
      3. Stop incentivising the ugly (abandonment of traditional groves and extending production into natural habitats should not be encouraged; poor management leads to erosion, desertification, fire risk and eutrophication)

      Tools must be broader to inform this balance. A number of participants agreed to continue discussing these issues in support of a short paper, brainstorming on how ES-informed LCA can help us better manage this iconic European product and landscape.

      Authors’ contact information:
  • Towards Realistic and Landscape-Based Prospective Ecological Risk Assessments: Mapping Variability and Diversity
    • Mark Egsmose, Franz Streissl and Jose V. Tarazona (European Food Safety Authority)

      Scientific knowledge and regulatory requirements have progressed immensely, and today more information is available on ecological, biological and physiological variability. Knowledge of different stressors and how they interact is growing. Challenges remain in deciding what to protect in a heavily modified agroecosystem.

      Defining what constitutes “environmental harm,” and which environmental values are to be protected, is highly complex. What are acceptable levels of change across locations and over time? A dialogue between risk assessors and risk managers is therefore needed to define the Specific Protection Goals and to decide how to present the risk assessment outcome.

      What can be attributed to natural vs. anthropogenic changes (spatial and temporal), and what are the expected consequences of these changes? The purpose of this special session was to raise awareness of these challenges and to provide examples on landscape-based effects and exposure assessment approaches that are already available.

      High interest in the session was evident from the lively, interactive discussion among the 200 participants.

      The session covered views from different experts and stakeholders from the European Commission – Joint Research Centre (JRC), an EU member state, industry and the European Food Safety Authority (EFSA), followed by an interactive plenary discussion.

      Key points from the five presentations included:

      1. Jose Tarazona (EFSA) introduced the session by setting the scene on how to move to landscape-based assessments while ensuring that risk assessment supports risk management.
      2. Landscape-based perspective in integrated environmental assessment was presented by Serenella Sala (JRC). The JRC science hub is available in support of mapping environmental and ecological variability across Europe. Examples were presented on how integrated environmental landscape assessments with land use modeling platforms (LUISA) can be applied, e.g. for life cycle assessments of substances.
      3. Véronique Poulsen (Anses) presented on linking the assessment of pesticides under Regulation (EC) 1107/2009 with the information collection and assessment requirements under Directive 2009/128/EC on the sustainable use of pesticides. Landscape risk assessment could be a way forward for pesticide authorization, for example in refined risk assessments and in evaluating and proposing risk mitigation measures. Landscape modeling may address parts of the uncertainty related to multiple uses of plant protection products. The level of complexity of these approaches needs to be discussed with risk assessors if they are to be applicable for regulatory use.
      4. An industry view on landscape risk assessment, scientific needs, examples, the state of the art and the way forward was presented by Anne Alix (ECPA). Industry sees improved equity in risk assessment outcomes and decision-making when landscape approaches are applied. However, concerns were raised that uncertainties may introduce more conservatism compared with lower tiers.
      5. Franz Streissl (EFSA) presented “Towards realistic and landscape-based prospective ecological risk assessments: Mapping variability and diversity.” Examples of scientific opinions and guidance from EFSA, where tools and methodology are already available, were presented. The benefit of using landscape approaches for identifying vulnerable areas where risk mitigation may be needed was highlighted.

      Conclusions from the Session
      The session made clear that some important aspects need consideration when moving to landscape approaches.
      Landscape risk assessment is a way forward for refining pesticide risk assessments and designing mitigation measures, and it can also address parts of the uncertainty. For example, one can locate species in the landscape and mitigate their exposure to pesticides, refine a risk assessment, and estimate the effectiveness of risk mitigation measures by identifying areas that may be at risk. It was emphasised that robust and validated methods and models are key to supporting landscape-level risk assessment approaches. Practical examples showed that information and scientific tools are already available.

      Tools and data are more advanced on exposure aspects and geographical data, compared with species populations, ecology and ecosystem services.

      For a progressive transition to landscape risk assessment, the landscape scale(s) need to be defined and the level of complexity needs to be discussed. Commitment from data holders to collect and make data and information available is essential. Many data, models and maps are available; collective efforts are needed to move from data to knowledge.

      There is a need to look at the approaches and tools available and to discuss what they bring to risk assessment and to supporting the decision-making of risk managers.

      Authors’ contact information:

      *This paper presents a summary of the session according to the authors’ views and does not necessarily represent the position of EFSA.

Return to Overview

Contact SETAC Globe
Contact the SETAC Europe office