13. Technology and Conservation

In October 1992, Julian Simon and Norman Myers held a historic debate at Columbia University on “Scarcity or Abundance” (Myers and Simon, 1994). While Myers, then identified by some as a “doom-sayer”, argued that environmental indicators were all heading in the wrong direction and that ultimately people would pay the price, Simon applauded growth in human populations and asserted that more people meant more minds to develop technological solutions to coming challenges. The argument over the power of technology to overcome human impacts on our world still rages (Ehrlich and Ehrlich, 2008), but evidence is accumulating to suggest that Myers may have got it right back in 1992. Still, it is worth asking whether technology can help us avoid the worst implications of the Malthusian prognosis of humanity outstripping Earth's carrying capacity.

Communities throughout the world have developed their own technologies over thousands of years. These traditional technologies have been overtaken by more modern forms, but they may still have much to offer (Klee, 1980; Gadgil and Berkes, 1991). Many of them are based on biomimicry and can be improved through the incorporation of some modern elements. IUCN's Commission on Environmental, Economic and Social Policy (CEESP) has widely promoted such approaches. These are increasingly entering mainstream development thinking and offer considerable potential as part of green economies.

Much modern technology has contributed to more comprehensive exploitation of natural resources and unanticipated side effects that have caused some of today's most intractable environmental challenges. However, as Simon would argue, new technologies may also be the basis for some of the solutions to those challenges.

WHICH TECHNOLOGIES AND WHAT IMPACT?

From an environmental perspective, key technologies that have both helped and hindered environmental conservation include information technology (IT), biotechnology, geo-engineering, and energy technology (Chapter 8).

Information technology

At the time of the 1992 Earth Summit in Rio de Janeiro, mobile phones were a rarity, the World Wide Web had barely been launched, and laptops were little more than portable desktop computers. In little more than 15 years, IT has made remarkable advances, bringing both costs and benefits for biodiversity.

The costs of IT advances can be calculated both in terms of the impacts of increased access to information and in terms of the impacts of developing and delivering the technology that supports that access. More accessible information has made it easier for those seeking to exploit nature to identify where valuable resources are and where potential markets might be. Sourcing the raw materials for the computers and mobile phones with which we gather and share our knowledge, and disposing of those devices when a new model hits the market, can also have significant negative impacts on the environment. Exploration and extraction of raw materials such as coltan have already had devastating impacts on biodiversity in places such as the Democratic Republic of the Congo. Making and operating computers and mobile phones is an energy- and water-intensive exercise, with resulting impacts on climate.


Using IT products consumes huge amounts of energy, including both the electricity to run personal computers and the needs of the server farms and other IT infrastructure that keep the internet running. Annual energy consumption varies from 52 to 482 kWh for computers and from 22 to 754 kWh for monitors (Bray, 2006), with the differences depending on the specifications and age of the equipment tested. By comparison, the average annual unit energy consumption for refrigerators in the United States was 1,239 kWh (http://www.eia.doe.gov/emeu/reps/enduse/er01_us.html#Electricity).
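As a rough, back-of-the-envelope illustration of these figures (a sketch only; real consumption depends on the hardware, its age and how it is used), a computer and monitor at the top of the quoted ranges together draw almost as much electricity in a year as the average US refrigerator:

```python
# Back-of-the-envelope comparison of annual electricity use (kWh/year),
# using the ranges quoted above. Figures are illustrative only.
computer_kwh = (52, 482)   # annual range for computers (Bray, 2006)
monitor_kwh = (22, 754)    # annual range for monitors (Bray, 2006)
fridge_kwh = 1239          # average US refrigerator (EIA figure cited above)

low = computer_kwh[0] + monitor_kwh[0]    # 74 kWh/year
high = computer_kwh[1] + monitor_kwh[1]   # 1,236 kWh/year

print(f"PC + monitor: {low}-{high} kWh/year; refrigerator: {fridge_kwh} kWh/year")
```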

Finally, when obsolete computers are discarded, the lead, mercury and other toxic substances used in their manufacture can cause serious pollution problems. The scale of this waste is immense. In 2005, used electronic equipment amounted to about two million tonnes of waste, most of it disposed of in landfills. In the United Kingdom alone, 1,700 mobile phones are thrown away every hour, 15 million every year. Their heavy metals, such as mercury, lead and cadmium, and other pollutants such as brominated flame retardants are left to pollute the soil. Much of the electronic hardware cast aside by industrialized countries goes to poor countries in Africa or Asia with ineffective environmental policies. On the other hand, recycling mobile phones reduces greenhouse gas (GHG) emissions, keeps valuable material out of landfills and incinerators, and conserves natural resources. Recycling just one million mobile phones reduces GHG emissions by an amount equal to taking 1,368 cars off the road for a year.

While information and communications technology (ICT) is not especially environmentally friendly, it is increasingly being mobilized to improve the management of ecosystem services and biodiversity.

IT advances are also being made in terms of the reduced size of instruments. Many elusive species can now be studied through radio-tracking, and tiny transmitters have already been attached to butterflies, indicating the degree of miniaturization that is now possible. Miniature video cameras have been attached to the critically endangered New Caledonian crow, enabling scientists, for the first time, to observe in detail the complicated lives these intelligent, tool-using birds lead. At the other end of the scale, elephants have been fitted with radio transmitters so that they can be followed by radio-tracking, both for scientific purposes and to help warn farmers when their fields might be raided by hungry pachyderms seeking a free meal.

Advances in IT, and the information that is now available as a result, enable policy makers and conservationists to manage threatened species and ecosystems better. IT is also supporting decision-making in other arenas, especially climate change, by helping to assess its real impacts, for example by comparing the size of glaciers in remote areas, measuring changes in the polar ice caps, and remotely taking the temperature of the Earth. IT will also be vital to understanding and monitoring how ecosystems respond to the measures taken.

Some of the most sophisticated uses of IT are being made by geneticists, who, without modern technology, would have little chance of understanding the genetic structure of the many species whose genomes have now been mapped. Dozens of knowledge-sharing genomic databases have been established, covering everything from rice to rats to zebrafish to humans and even the duck-billed platypus. These model organism databases provide a highly advanced research tool for scientists, enabling them to answer far more sophisticated research questions than were previously possible.

Despite these advances, the biggest challenge is ensuring that more comprehensive knowledge of biodiversity contributes to effective policy and decision-making. IT can and should play a pivotal role in addressing this challenge. All indications are that these technological advances will continue to accelerate, providing quick and easy access to an increasingly broad range of important information, ranging from DNA analysis to soil micro-organism richness to the ecological footprint of humanity. All of this provides an opportunity to build a technological future that also significantly enhances the management of biological resources, a marriage of technology and biology that can lead to a more sustainable future.

In addition to the hardware aspects of IT, the means by which we manage and manipulate information is also changing. As computers become more powerful, along the lines predicted by Moore's Law (the number of transistors that can be packed onto a microchip, and hence computing capacity, doubles roughly every 18 months to two years), our ability to explore areas that require extensive and complex computation has also expanded.
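Taken at face value, the doubling rule quoted above describes simple exponential growth. As a rough illustration only (assuming the 18-month doubling period stated here), capacity C(t) after t years is

$$C(t) = C_0 \cdot 2^{t/1.5}$$

so the roughly 15 years between the Rio Earth Summit and the time of writing correspond to about ten doublings, an increase of roughly a thousand-fold.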

One of the limiting factors in projecting impacts on nature is the uncertainty involved, something that has plagued the climate community for many years. New methodologies for integrating uncertainty into calculations and modelling are emerging, including the use of “fuzzy numbers” and Bayesian networks. These are now also being used in environmental research and management, including in the assessments undertaken for the IUCN Red List of Threatened Species.
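As a purely illustrative sketch of the kind of reasoning such methods formalize (this is not the IUCN Red List procedure, and all the numbers below are invented for the example), Bayes' rule can be used to update the estimated probability that a population is declining as imperfect survey evidence accumulates:

```python
# Illustrative Bayesian update of the probability that a population is declining,
# given one imperfect survey result. All values are hypothetical; this is not
# the IUCN Red List assessment methodology.

prior_decline = 0.30        # prior belief that the population is declining
p_detect_if_decline = 0.80  # survey reports a decline when one is really occurring
p_false_alarm = 0.10        # survey wrongly reports a decline in a stable population

def update(prior, p_true_positive, p_false_positive, survey_reports_decline=True):
    """Apply Bayes' rule for a single survey observation."""
    if survey_reports_decline:
        numerator = p_true_positive * prior
        denominator = numerator + p_false_positive * (1 - prior)
    else:
        numerator = (1 - p_true_positive) * prior
        denominator = numerator + (1 - p_false_positive) * (1 - prior)
    return numerator / denominator

posterior = update(prior_decline, p_detect_if_decline, p_false_alarm)
print(f"Posterior probability of decline: {posterior:.2f}")  # about 0.77
```

A single noisy observation shifts, but does not settle, the assessment; Bayesian networks chain many such updates across linked variables, while fuzzy numbers offer an alternative way of carrying interval-style uncertainty through a calculation.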

BIOTECHNOLOGY

Biotechnology is closely linked with these emerging information management tools. It can be defined as any application of technology to biological systems, and it has a long history, stretching back to the use of yeast in baking bread and of fermentation in making alcoholic beverages. These historical applications have been joined by more modern ones, including nanotechnology, biomimicry, and genetic modification. Some of these new applications of biotechnology are both powerful and novel, calling for the application of a precautionary approach.

Nanotechnology

Nanotechnology involves working at the near-atomic scale of roughly one-billionth of a metre (a nanometre). At this scale, materials behave in ways that are very different from the behaviour of the same substances in bulk form. Nanoparticles are so small that they can enter cells that are impermeable to larger particles; hence their use in cosmetics, for example, could carry health implications. Further, nanoparticles have a large surface area relative to their volume, enhancing their chemical and electrical properties and increasing the risk that they could trigger damaging reactions within a cell they have invaded.
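A simple geometric calculation illustrates this last point: for a spherical particle of radius r, the ratio of surface area to volume is

$$\frac{A}{V} = \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} = \frac{3}{r}$$

so shrinking a particle from a radius of one micrometre to ten nanometres increases the surface area exposed per unit of material by a factor of about one hundred.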

While nanoparticles can be produced naturally, for example by volcanoes, engineered nanoparticles are becoming big business. Global investment in nanotechnology in 2005 was US$ 10 billion, and this is expected to increase to US$ 1 trillion by 2011–2015 (Navarro et al., 2008). Benefits for people in medicine, electronics and the environment are expected; for example, the ability of nanoparticles to bind with polluting chemicals could reduce the bioavailability of those toxic substances. However, the potential for nanoparticles to have toxic effects, such as lung irritation, has also been recognized, and the unknowns surrounding their use are many (Navarro et al., 2008).

The field of nanotechnology is virtually unregulated today, and few, if any, studies have been done on its possible impacts on biodiversity. Like any new and powerful technology, nanotechnology should be approached with caution, and the application of the precautionary approach would seem appropriate. Sutherland et al. (2008) included nanotechnology among 25 novel threats facing biodiversity. They recommended that "if use becomes widespread or the structures are incorporated into 'near-living' systems, new approaches to risk will be needed". For its part, the European Commission (EC) has issued a "code of conduct" for nanotechnology (EC, 2008). Its section on sustainability states:

Nanosciences and nanotechnologies research activities should be safe, ethical and contribute to sustainable development serving the sustainability objectives of the Community as well as contributing to the United Nations' Millennium Development Goals. They should not harm or create a biological, physical or moral threat to people, animals, plants or the environment, at present or in the future.

The International Risk Governance Council (IRGC) also notes that while nanotechnology presents great potential benefits, it also poses serious risks with significant social, economic, political and ethical implications. The IRGC suggests that, because the issues raised by nanotechnology are more complex and far-reaching than those raised by many other innovations, decision makers need to manage the associated uncertainties and risks (IRGC, 2007).

Biomimicry

“Biomimicry” is derived from the Greek words “bios”, meaning life, and “mimesis”, meaning imitation. The word refers to the application of models and processes from nature to industrial or agricultural design in order to solve human problems. As coined by Janine Benyus (1997), it is an approach that learns from nature, rather than just about nature.

Biomimicry is based on the principle that, through the process of evolution, nature has learned what works, what is appropriate and what is sustainable. Nature includes organisms that fly, that occupy virtually the entire globe, that maintain appropriate living conditions, and that build amazingly complex structures. Nature has developed biodegradable glues, produced by mussels, that work underwater; spider silk that is, weight for weight, stronger than steel; termite mounds that maintain a near-constant internal temperature despite external temperatures that swing from 40°C during the day to near freezing at night; and the feet of geckos, which enable them to cling to a smooth ceiling.

We are already using biomimicry applications in our everyday lives. Velcro was inspired by the common burr, and the Wright brothers, in designing the first powered aircraft, were inspired by the wings of birds. The solar panels that power orbiting satellites are unfolded following patterns learned from the way leaves unfurl from tiny buds, and low-energy modern buildings have been modelled on termite nests. Work on biomimicry is highlighting the role that a new generation of well-adapted technologies, based on nature's design principles, can play in a sustainable future.

As applications of biomimicry that support improved livelihoods become more common, the value of all biodiversity as a living laboratory for future needs becomes ever more apparent. The rationale for conserving all of nature, as a key risk-management strategy for capturing option value, is strongly supported by advances in biomimicry.

Genetically Modified Organisms (GMOs)

Genetic modification is a particularly controversial way of altering genetic diversity. GMOs are becoming increasingly prevalent in many countries and are being used in many sectors, from agriculture to health to energy supplies. IUCN Members have acknowledged this growing trend and, while noting the potential of GMOs to improve livelihoods and promote development, have expressed concern about their potential negative impacts on food safety and the environment. This concern is reflected in IUCN Resolution WCC 3.007, in which the Union calls for “a moratorium on further environmental releases of GMOs until these can be demonstrated to be safe for biodiversity, and for human and animal health, beyond reasonable doubt”. IUCN Members have also recognized the rapid developments in the field of genetic technology and have requested ongoing updates on this issue.

Potential negative impacts of GMOs include a reduction in biodiversity, threats to human health, unexpected consequences of gene transfer between plants, and the creation of pests or weeds that are resistant to control measures. The Parties to the Convention on Biological Diversity (CBD) have recognized both the potential benefits and the potential costs of GMOs through the Cartagena Protocol, which promotes informed and cautious use of this technology and works to build capacity in all countries to support the decision-making processes involved. IUCN Members have called on governments to ratify the Cartagena Protocol.

The United Nations organizations responsible for human health and food production have found no evidence to date of negative impacts of GMOs on biodiversity or human health. A 2003 review of research undertaken to assess the environmental impact of transgenic crops concluded that insufficient monitoring and testing had been carried out to make any determination in that regard (Ervin et al., 2003). Although scientists have found little conclusive evidence of direct negative impacts of GMOs on biodiversity or human health, other ethical issues need to be considered. Some organizations share the views of Via Campesina, a worldwide movement of peasant farmers, which believes that GMO technology poses a serious and immediate threat to the security and livelihoods of peasant farmers (www.viacampesina.org). On the other hand, some farmers in developing countries such as China, India, Argentina and Brazil welcome GMO crops, especially cotton, soybeans and maize.

Geo-engineering

Geo-engineering is the deliberate modification of the environment to achieve specific outcomes relating to human needs. With respect to climate change, two aspects of geo-engineering are considered: managing solar radiation, for example through the creation of stratospheric sulphur aerosols; and managing GHG emissions, for example through carbon capture and storage or the use of biochar as a carbon sink (Victor et al., 2009). The side effects of these technologies remain largely unknown. At least one geo-engineering technology, fertilizing the ocean with iron to promote the growth of carbon-sequestering phytoplankton, has been tested, leading to considerable debate in global environmental policy arenas; governments have agreed a moratorium on further testing of this technology.

Matthews and Caldeira (2007), looking specifically at the question of managing solar radiation, reported that while geo-engineering may mitigate some warming, it would also mask the effects of continued increases in GHG concentrations. Should geo-engineering measures fail or be stopped abruptly, the result could be very rapid climate change, with warming rates up to 20 times greater than present-day rates. They concluded that relying on geo-engineering without complementary efforts to reduce carbon emissions presents high risks for the global climate system.

Synthetic biology

While some consider that synthetic biology is simply an extension of genetic engineering, it is in fact much more complex, involving the engineering of new biological systems, parts or devices that do not exist in nature, and the re-design or re-engineering of existing biological elements for useful purposes (IRGC, 2008b). While genetic engineering typically involves only one or a few genes at a time, synthetic biology creates entire new organisms or metabolic units. Although the technology is still in its infancy, researchers have already shown that it is possible to create viral genomes such as that of the polio virus (Cello et al., 2002) and to reconstruct the virus responsible for the 1918 influenza pandemic (Tumpey et al., 2005).

Synthetic biology is an emerging branch of biology, and most work in the field is far from having any commercial application. But its advocates see potential in bioremediation (for example, degrading pesticides and removing pollutants), in bio-sensors that can detect toxic chemicals, in bacteria or viruses that could identify cancer cells and deliver therapeutic agents where they are needed, in more efficient development of pharmaceuticals, in micro-organisms engineered to produce new sources of energy, and in other applications beyond current imagination.

On the other hand, synthetic biology could pose substantial risks, such as unintended detrimental effects on the environment from the accidental release of synthetic organisms, including those originally designed for bioremediation. Using synthetic biology to create micro-organisms could have highly unpredictable effects; in a worst-case scenario, harmful organisms could be deliberately created (though it is currently much easier to obtain pathogens in other ways). At a philosophical level, it is conceivable that synthetic biology will lead to most evolution taking place in the laboratory rather than in nature, potentially posing significant risks to the very concept of nature, and to biodiversity.

In 2003, J. Craig Venter and his team of researchers built a fully synthetic viral genome, that of the bacteriophage phiX174, in just two weeks (Smith et al., 2003). Since then, the J. Craig Venter Institute has remained at the forefront of synthetic genomics, examining and replicating the genetics of life. In 2008, its scientists announced the first completely synthetic bacterial genome, that of Mycoplasma genitalium, thereby taking a significant step towards artificial life.

The tools of synthetic biology are easily available online in an open-access library, the Registry of Standard Biological Parts (http://parts.mit.edu). Undergraduates already compete to use "BioBricks" from the Registry to develop their own synthetic biological devices, though no regulatory measures have yet been put in place to ensure that such experimentation does not threaten the environment (IRGC, 2008). Synthetic biology is a field that certainly calls for the precautionary approach, and it deserves greater attention from the conservation community than it is currently receiving.

MAKING THE MOST OF TODAY'S TECHNOLOGY WHILE SUPPORTING THE ENVIRONMENT

To meet today's conservation challenges, new technology will be particularly important in providing the means to deal with some of the main threats to biodiversity and ecosystem services, such as climate change, pollution, and invasive alien species. Making the most of technology that is compatible with environmental conservation means that, in the coming decade, we will need to support the development of the tools and information technology required to manage vulnerable ecosystems effectively and to ensure sustainable livelihoods for the people living in them. We will also need to apply a precautionary approach to the many uncertainties about the longer-term impacts of some of these technologies, and to adopt some fundamental behavioural changes to manage the impacts of consuming them, including paying attention to the 3Rs: reduce, reuse and recycle.
