Environmental Monitoring
by Ruth McDonald
www.tdmebooks.com
All rights reserved. No portion of this book may be reproduced in any form without
permission from the publisher, except as permitted by U.S. copyright law. For permissions
contact:
[email protected]
Published by:
3585 S Vermont Ave #7367,
Los Angeles, CA 90007, US
Website: www.tdmebooks.com
to do so upon contact with substances which are toxic to them. Both naturally
occurring and specially developed light-emitting microorganisms are used.
Positively acting bacterial biosensors have been constructed which start
emitting light upon contact (and subsequent reaction) with a specific
pollutant. In the USA such a light-emitting bacterium has been approved for
the detection of polyhalogenated aromatic hydrocarbons in field tests.
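As a rough illustration of the principle, the short Python sketch below flags a sample as positive when its luminescence exceeds that of a blank control by a chosen factor. The readings and the threshold are entirely hypothetical; this is not an approved field-test protocol.

# Illustrative sketch only: the readings and threshold are hypothetical and
# do not correspond to any approved biosensor field-test protocol.

def pollutant_detected(sample_rlu, blank_rlu, fold_threshold=3.0):
    """Flag a positive response from a light-emitting bacterial biosensor.

    sample_rlu, blank_rlu: luminescence readings in relative light units.
    fold_threshold: how many times brighter than the blank counts as a hit.
    """
    if blank_rlu <= 0:
        raise ValueError("blank reading must be positive")
    return sample_rlu / blank_rlu >= fold_threshold

# A sample three and a half times brighter than the blank is flagged.
print(pollutant_detected(sample_rlu=3500, blank_rlu=1000))  # True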
Immunoassays use labelled antibodies (complex proteins produced in biological
response to specific agents) and enzymes to measure pollutant levels.
If a pollutant is present, the antibody attaches itself to it, and the label makes
the complex detectable through colour change, fluorescence or radioactivity.
Immunoassays of various types have been developed for the continuous,
automated and inexpensive monitoring of pesticides such as dieldrin and
parathion. Because the results of these techniques can be as simple as a colour
change, they are particularly suitable for highly sensitive field testing where
the time and large equipment needed for more traditional testing are
impractical. Their use is, however, limited to pollutants which can trigger
biological antibodies. If the pollutants are too reactive, they will either destroy
the antibody or suppress its activity, and with it the effectiveness of the test.
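In practice, quantitative immunoassays are read against a standard curve, and a four-parameter logistic fit is one common choice for that curve. The Python sketch below assumes this approach; the standard concentrations and signals are invented for illustration and do not describe any particular commercial assay.

# Minimal sketch, assuming a competitive immunoassay read against a
# four-parameter logistic standard curve. All numbers are invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic model commonly used for immunoassay standards."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical standards for a pesticide immunoassay: concentration (ng/mL)
# against measured signal (signal falls as pollutant concentration rises).
std_conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
std_signal = np.array([0.95, 0.90, 0.75, 0.48, 0.22, 0.10])

params, _ = curve_fit(four_pl, std_conc, std_signal, p0=[1.0, 1.0, 2.0, 0.05])
a, b, c, d = params

def concentration_from_signal(y):
    """Invert the fitted curve to estimate pollutant concentration (ng/mL)."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(round(concentration_from_signal(0.5), 2))  # illustrative estimate only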
GENETIC ENGINEERING
Recombinant DNA technology has had amazing repercussions in the last
few years. Molecular biologists have mapped entire genomes, many new
medicines have been developed and introduced and agriculturists are
producing plants with novel types of disease resistance that could not be
achieved through conventional breeding. Several of the previously mentioned
examples like the amylose-free potato and the indigo-producing bacterium
also involve the use of organisms genetically modified by recombinant DNA
technology.
Many enzymes are routinely produced by genetically modified organisms
too. Given the overwhelming diversity of species, biomolecules and metabolic
pathways on this planet, genetic engineering can in principle be a very
powerful tool in creating environmentally friendlier alternatives for products
and processes that presently pollute the environment or exhaust its non-
renewable resources. Politics, economics and society will ultimately determine
which scientific possibilities will become reality.
Nowadays organisms can also be supplemented with additional genetic
properties for the biodegradation of specific pollutants if naturally occurring
organisms cannot do the job properly or quickly enough.
LEGISLATION
Regulation to ensure safe application of novel or modified organisms in
the environment is important, not least to maintain public confidence. The
European Union has two Directives on the contained use of genetically
modified micro-organisms, and on the deliberate release of genetically
modified organisms into the environment. These have been implemented in
the national legislation of most EU Member States. They require that a detailed
Environmental biotechnology has a history extending back into the last century.
As the need to move towards less destructive patterns of economic activity is
better appreciated, while social conditions continue to improve despite a
growing population, the role of biotechnology grows as a tool for
remediation and environmentally sensitive industry. Already, the technology
has been proven in a number of areas and future developments promise to
widen its scope. Some of the new techniques now under consideration make
use of genetically modified organisms designed to deal efficiently with
specific tasks.
As with all situations where there is to be a release of new technology into
the environment, concerns exist. There is a potential for biotechnology to
make a further major contribution to protection and remediation of the
environment. Hence biotechnology is well positioned to contribute to the
development of a more sustainable society. As we move into the next
millennium, this will become even more important as populations,
urbanisation and industrialisation continue to climb.
The conventional chemical production of indigo, the dye used for blue jeans,
takes eight steps, requires very toxic chemicals, and demands special protection
measures for the process operators and the environment. The biotechnological production of indigo,
which uses a genetically modified bacterium containing the right enzymes,
takes only three steps, proceeds in water, uses simple raw materials like sugar
and salts and generates only indigo, carbon dioxide and biomass which is
biodegradable.
AGRICULTURAL AND
ENVIRONMENTAL BIOTECHNOLOGY
Biotechnology refers to the practical application of modern laboratory
techniques such as recombinant DNA. Although most biotechnology research
has been medical, more and more is being undertaken for agricultural and
environmental uses. The term biological control usually refers to solutions
found in nature that can replace synthetic chemicals in the environment.
Since many biological control processes are being improved through
biotechnology, it has become impractical to distinguish between the two.
With proper attention to risk management, both natural and laboratory-
enhanced “green technologies” have the potential to protect our wildlife
heritage and reduce our chemical contamination of the environment.
ENVIRONMENTAL MONITORING
Both heavy metals and persistent organic pollutants (POPs) are capable
of accumulating in plants and animals. As they move up the food chain,
these contaminants often increase in concentration and become more harmful
to wildlife. Many species of bacteria, blue-green algae, mosses, ferns, and
dicots are capable of absorbing POPs and heavy metals. By following the
concentration of contaminants in specific plants, changes in environmental
contamination can be monitored.
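As a rough illustration of how such indicator-species data might be tracked, the Python sketch below computes a simple bioconcentration factor, the ratio of the contaminant concentration in the organism to that in its surroundings, for a hypothetical moss sampled over several years. All of the numbers are invented.

# Illustrative only: the tissue and water concentrations below are invented.

def bioconcentration_factor(tissue_conc, environment_conc):
    """Ratio of contaminant concentration in an organism to its surroundings.

    Values well above 1 indicate accumulation in the indicator species.
    Units must match, e.g. both in micrograms per kilogram.
    """
    return tissue_conc / environment_conc

# Moss tissue sampled at the same site over three years (ug/kg dry weight)
# against the average concentration in local surface water (ug/kg).
yearly_tissue = {2021: 40.0, 2022: 55.0, 2023: 75.0}
water_conc = 0.5

for year, tissue in sorted(yearly_tissue.items()):
    print(year, round(bioconcentration_factor(tissue, water_conc), 1))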
PRINCIPAL DIRECTIONS OF
BIOTECHNOLOGICAL DEVELOPMENT
In a recent review of agricultural technology, the Office of Technology
Assessment (OTA) of the US Congress defined biotechnology to include
'any technique that uses living organisms or processes to make or modify
products, to improve plants or animals or to develop micro-organisms for
specific uses-it focuses upon recombinant DNA and cell fusion technologies'.
Longworth concurs with this definition, with the significant addition of
tissue-culture techniques, an aspect of biotechnology which has great
economic, and therefore social and political, potential impact. Despite debates
about whether these or any other definitions are adequate, they do touch on
the main aspects of biotechnology which deserve to be considered.
The aspect of this new biotechnology which most captures the imagination
and stirs the greatest controversy is gene splicing or recombinant DNA
techniques, which inspire researchers to consider the possibilities of producing
reproducible animals and plants markedly different from currently existing
species, referred to as transgenic species. Already in higher animals and
plants recombinant DNA has produced transgenic forms which are being
commercially exploited.
subsidies have been required to maintain the Brazilian and USA ethanol
programmes. Longworth states, however, that a new biotechnology,
Sucrotech, is being patented, which will not only reduce the cost of producing
ethanol from sugar cane, but will simultaneously produce fructose at a cost
which will be competitive with HFCS and may thereby reclaim part of the
sweetener market for sugar cane.
That such possibilities are in prospect indicates how volatile the future
might be; biotechnology has tilted the competitive balance from sugar cane
to maize, causing economic pressure and even disruption to sugar-cane-
dependent economies and may in future switch it back again. In the long run
the capacity to ferment fuels microbially from renewable agricultural
feedstocks points to an important long-term reorientation of agriculture if
non-renewable oil becomes uncompetitively expensive as a fuel for cars and
feedstock for certain chemicals.
It suggests that eventually there will be an increased emphasis on
agricultural production of industrial feedstocks at the same time as continually
increasing food output will be required to feed the expanding world
population. All this will require considerable increases in agricultural
productivity, to which biotechnology will increasingly contribute, but it will
at the same time impose great strains on the natural environment and the
structure of agriculture.
REGULATION OF
AGRICULTURAL BIOTECHNOLOGY
For traditionally bred plants, regulators rely on plant breeders to conduct
appropriate safety testing and to be the first line of defence against genetic
alterations that might prove dangerous. The same should be true for
agricultural products developed using biotechnology. There is no reason for
regulators to treat biotech plants any differently than traditionally bred plants,
particularly given the fact that biotechnology provides greater control over
gene manipulation.
Despite fears about gene manipulation, traditional crossbreeding has altered
plant genes and improved the human diet for the past 30 years. In fact, people
worldwide safely consume these plants every day. Examples include wheat,
corn, potatoes, tomatoes, and countless other staples of the American diet. In
1984, the biotechnology industry began experimenting with “gene-spliced
plants” — the more advanced approach to gene manipulation that we now
call “biotechnology.”
USDA regulates the release of all genetically engineered agricultural plants
under statutes giving USDA’s Animal and Plant Health Inspection Service
(APHIS) the authority to regulate plants that may be or may become weeds
or other nuisances — what the statutes call “plant pests.” Although the rules
apply in a general sense to novel or exotic varieties of both gene-spliced and
conventional plants, APHIS typically only requires field testing of
conventional plants that are new to a particular U.S. ecosystem (transplanted
from another continent, for example). However, all genetically engineered
agricultural plants face a higher regulatory bar — making it more difficult to
expand the use of biotechnology.
Genetically engineered crops must be field tested under APHIS
regulations prior to commercialisation. For example, a new variety of corn
produced with conventional hybridisation requires no government-
mandated field testing, but all new varieties of genetically engineered corn
do, even though there is no logical reason for the regulatory disparity. For
most genetically engineered plants, APHIS requires the company producing
the plants to submit notice detailing the gene or genes that have been inserted,
where the plant will be tested, and other relevant characteristics of the plant
prior to receiving permission to conduct the field trials. Once the company
completes field testing, APHIS reviews the results and makes a determination
on whether or not the product should be “deregulated” and can be released
into the market.
REGULATORY SCHEME
At the time, the White House Office of Science and Technology Policy
began crafting a framework wherein existing federal agencies would regulate
genetically engineered organisms on the basis of their characteristics, not
the method of production — thus wisely deferring to the scientific
community’s judgement that regulation ought to address the products, not
the process of biotechnology.
Engineered organisms would not require extra scrutiny simply because
genetic engineering produced them (the process). Instead, they would be
subject to heightened scrutiny only if the individual organisms expressed
characteristics that posed some conceptually heightened risk (the product).
The federal government divided regulatory jurisdiction among agencies
already involved in agricultural, food, and environmental regulation.
These include the United States Department of Agriculture (USDA), the
Environmental Protection Agency (EPA), and the Food and Drug
Administration (FDA). While each of these agencies considers the
characteristics of individual products in their regulation, only FDA followed
the general scientific thinking that genetically engineered and non-genetically
engineered products should be regulated similarly. Both USDA and EPA
automatically subject all genetically engineered plants as a class to premarket
approval requirements not ordinarily applied to conventionally bred plants.
LABELLING ISSUES
Some activists, however, argue that the government should mandate the
labelling of all genetically engineered foods. They assert that consumers have
a “right to know” how their foods have been altered, and that a mandatory
label would best allow consumers to choose between genetically engineered
and conventional foods. Biotechnology advocates have argued against
mandatory labelling because such requirements raise food costs— something
that mostly harms lower-income Americans and people on fixed budgets.
Perhaps more important, while biotech products are not substantially
different from other products, special labels would likely make consumers
think these products were more dangerous. Hence, rather than serving
educational or “right to know” purposes, such labels promise to simply
confuse consumers.
A government-mandated label on all genetically engineered foods also
would raise important First Amendment free speech issues. In 1996, the U.S.
Court of Appeals, in International Dairy Foods Association et al. v. Amestoy,
ruled unconstitutional a Vermont statute requiring the labelling of dairy
products derived from cows treated with a genetically engineered growth
hormone, noting that food labelling cannot be mandated simply because some
people would like to have the information. “Absent … some indication that
AUTHORITY
The US FDA is responsible for ensuring the safety and wholesomeness of
all food and food components, including the products of rDNA technology,
under the FFDCA. The US FDA has the authority for the immediate removal
of any product from the market that poses potential risk to public health or
that is being sold without all necessary regulatory approvals. As a result, a
legal burden is placed on developers and food manufacturers to ensure the
commodities utilised and foods available to consumers are safe and in
compliance with all legal requirements of the FFDCA.
In order to understand the regulatory approach followed by the US FDA
in the safety evaluation of GM crops, it is useful to consider food and food
safety from a historical context. People had been consuming foods derived
from agricultural crops for many years prior to the existence of any food
laws or regulations within the United States.
Based on this experience, agricultural crops have been accepted as being
safe for consumption as food, without additional testing to demonstrate
their safety. As long as the new crop variety has exhibited similar agronomic
ROLE
Consistent with recommendations within the Coordinated Framework,
the US FDA considered existing provisions of the FFDCA to be sufficient
for the regulation of foods and food components developed using rDNA
technology. It was concluded that the scientific and regulatory issues posed
by the products of rDNA technology were not significantly different from
those posed by conventional products.
As a result, GM foods and food components have been subject to the
same standards of safety as already exist for the regulation of other foods
and food components under the FFDCA. In order to better communicate
interpretations of existing provisions of the FFDCA as they relate to the
safety evaluation of foods derived from new plant varieties, including
the products of rDNA technology, the US FDA released a policy statement
in 1992 entitled 'Statement of policy: foods derived from new plant
varieties'.
The US FDA has considered the use of genetic modification (i.e. rDNA
technology) in the development of new plant varieties to represent a
continuum of conventional plant breeding practices (e.g. mutagenesis,
hybridisation, protoplast fusion, etc.), and as a result, the safety evaluation
of all new plant varieties, not just those developed using rDNA technology,
is based on an objective analysis of the characteristics of
a food or its components, and not on its method of production.
PRODUCT CHARACTERISATION
Product characterisation takes into consideration information relating to
the modified food crop, the introduced genetic material and its expression
product, and acceptable levels of inherent plant toxicants and nutrients. All
characteristics of the gene insert must be known, including the source(s),
size, number of insertion sites, promoter regions, and marker sequences. It
must be established that the transferred genetic material does not come from
a pathogenic source, a known source of allergens, or a known toxicant-
producing source. The introduced genetic material should be well
characterised to ensure that the introduced gene sequences do not encode harmful
substances and are stably inserted within the plant genome to minimise any
potential opportunity for undesired genetic rearrangement. Analytical data
are required to evaluate the nutritional composition, the levels of any known
toxicants, anti-nutritional and allergenic substances, and the safety-in-use of
antibiotic resistance marker genes.
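The requirements above amount to a characterisation checklist. The Python sketch below captures them in a simple record so that gaps can be flagged before review; the field names and example values are illustrative only and do not represent any official submission format.

# A minimal sketch of a characterisation record; the field names and example
# values are illustrative and are not an official US FDA data format.
from dataclasses import dataclass, field

@dataclass
class GeneInsertRecord:
    source_organism: str        # must not be a pathogenic or allergenic source
    insert_size_bp: int
    insertion_sites: int
    promoter_regions: list = field(default_factory=list)
    marker_genes: list = field(default_factory=list)
    stably_integrated: bool = False

    def open_questions(self):
        """Return characterisation gaps that would trigger further review."""
        gaps = []
        if not self.stably_integrated:
            gaps.append("stable integration not demonstrated")
        if self.marker_genes:
            gaps.append("safety-in-use of marker genes must be documented")
        return gaps

record = GeneInsertRecord("Bacillus thuringiensis", 3500, 1,
                          promoter_regions=["CaMV 35S"], marker_genes=["nptII"])
print(record.open_questions())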
Any new substances introduced into crops through rDNA technology (e.g.
proteins, fatty acids, carbohydrates) will be subject to pre-market review as
food additives by the US FDA, unless substantially similar to substances
already safely consumed as a part of foods, or that are considered GRAS. To
date, substances that have been added to foods through rDNA technology
have been previously consumed or have been determined to be substantially
similar to substances already consumed as a part of the diet.
As such, introduced substances have been considered exempt from the
requirement for pre-market approval as food additives with the US FDA. A
more rigorous safety evaluation of a GM crop is warranted if the introduced
gene sequence(s) has not been fully characterised, the nutritional composition
has been significantly altered, antibiotic resistance marker genes have been
used during its development, or if an allergenic protein or toxicant has been
detected at levels higher than what is typically observed in edible varieties of
the same crop species.
In any event, determinations as to the safety of substances that have been
introduced into new plant varieties through rDNA technology are made on a
case-by-case basis. Although an evaluation of the introduced gene sequences
and expression product(s) provides assurance as to their safety, further studies
may be required to predict whether unexpected effects may result following
their interaction with other genes within the plant.
In addition, the product characterisation of a GM plant involves assessing
sequence homology to known toxicants and allergens, thermal and digestive
stability, and if required, the results of both in vitro and in vivo assays to
demonstrate lack of toxicity.
COMPOSITIONAL ANALYSIS
The results of field trials performed over several years serve to characterise
the phenotypic and agronomic characteristics exhibited by the plant (e.g.
height, colour, leaf orientation, susceptibility to disease, root strength, vigour,
fruit or grain size, yield, etc.), as well as to provide the materials required for
the compositional analysis. Any anomalies in the phenotypic or agronomic
characteristics exhibited by a plant may result in a requirement for additional
information. Protein, fat, fibre, starch, amino acid, fatty acid, ash and sugar
levels are determined, as well as the levels of anti-nutrients, natural toxicants
or known allergens.
Studies of the nutritional composition are performed to determine whether
the levels of any key nutrients, vitamins or minerals have been altered as a
result of the genetic modification. Based on the results of these studies, a
determination is made as to whether the phenotypic and agronomic
characteristics of a GM crop or the concentrations of inherent constituents
fall within ranges typical of its conventional counterpart.
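The determination described above can be expressed as a simple range check. In the Python sketch below, measured constituent levels of a hypothetical GM crop are compared against invented reference ranges standing in for the conventional counterpart; any constituent falling outside its range would prompt additional information or action.

# Illustrative sketch: the reference ranges and measured values are invented
# and do not represent real crop composition data.

conventional_ranges = {             # typical ranges for the conventional counterpart
    "protein_pct":   (8.0, 12.0),
    "total_fat_pct": (3.0, 5.0),
    "phytate_mg_g":  (6.0, 10.0),   # example anti-nutrient
}

gm_measurements = {"protein_pct": 10.5, "total_fat_pct": 4.1, "phytate_mg_g": 11.2}

def outside_typical_range(measured, ranges):
    """List constituents that fall outside the conventional range."""
    flags = []
    for name, value in measured.items():
        low, high = ranges[name]
        if not (low <= value <= high):
            flags.append((name, value, (low, high)))
    return flags

print(outside_typical_range(gm_measurements, conventional_ranges))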
If inserting a new gene causes no change in any of the assessed parameters,
the US FDA can conclude with reasonable assurance that the GM crop is as
safe as the conventional crop. If the levels of essential nutrients or inherent
toxicants are found to be significantly different in the GM crop, the US FDA
may recommend additional action prior to commercialisation, such as
obtaining food additive status, or the use of specific labels to alert consumers
of an altered nutritional content, etc.
ALLERGENICITY
In consultation with scientific experts in the areas of food safety, food
allergy, immunology, biotechnology and diagnostics, the US FDA published
guidelines for assessing the allergenicity of GM foods or food components
in 1994. The approach to assessment is multi-faceted, incorporating data
regarding the origin of the genetic material, and the biochemical,
immunological and physicochemical properties of the expressed protein.
The overall assessment is reliant upon the fact that all known food allergens
are proteins and, notwithstanding the number of shared properties between
allergenic and non-allergenic food proteins, food allergens tend to exhibit a
number of similar characteristics. In general, food allergens share a number
of common properties: they have a molecular weight of over 10 000 Da; they
represent more than 1 per cent of the total protein content of the food; they
demonstrate resistance to heat, acid treatment, proteolysis and digestion; and
they are recognised by IgE.
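Those shared properties can be read as a crude screening checklist, sketched below in Python. It is not the published assessment procedure, which also relies on sequence comparison and, where appropriate, serum testing; it simply shows how the listed criteria might be tallied for a newly expressed protein.

# A crude screening sketch based on the shared properties listed above; real
# evaluations also rely on sequence comparison and serum testing.

def allergen_risk_flags(mw_da, pct_of_total_protein, heat_stable,
                        digestion_resistant, ige_reactive):
    """Count how many typical food-allergen properties a protein shares."""
    flags = {
        "molecular weight > 10 kDa": mw_da > 10_000,
        "more than 1% of total protein": pct_of_total_protein > 1.0,
        "stable to heat and acid": heat_stable,
        "resists proteolysis and digestion": digestion_resistant,
        "recognised by IgE": ige_reactive,
    }
    return [name for name, hit in flags.items() if hit]

# Hypothetical newly expressed protein: rapidly digested, no IgE binding.
print(allergen_risk_flags(mw_da=26_000, pct_of_total_protein=0.01,
                          heat_stable=False, digestion_resistant=False,
                          ige_reactive=False))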
For gene sequences derived from known allergenic sources (e.g. peanuts),
the developers of GM plants are expected to demonstrate that allergenic proteins
have not been introduced into the food. For assessment purposes, it is assumed
that any genetic material derived from a known allergenic source will encode
for an allergen. To demonstrate otherwise, the amino acid sequence of an
expressed protein must be compared with that of known allergens using protein
ANTIBIOTIC RESISTANCE
The use of antibiotic-resistance genes as selectable markers has been
common practice in the development of new plant varieties using rDNA
technology. Concerns relate to the potential transfer of antibiotic-resistance
genes from GM plants to pathogens in the environment or to the gut of humans
consuming foods or food components. Issues relating to the use of antibiotic
resistance genes were identified in the 1992 US FDA Statement of Policy,
as well as discussed in additional guidance entitled Guidance for
Industry: Use of Antibiotic Resistance Marker Genes in Transgenic Plants.
The guidance provided within these documents was established in
consultation with experts in the fields of microbiology, medicine, food safety,
bacterial and mycotic diseases, and includes suggestions with respect to the
continued safe use of antibiotic-resistance marker genes by the developers
of new plant varieties.
The use of marker genes that encode resistance to clinically important
antibiotics has raised questions as to whether their presence in food could
reduce the effectiveness of oral doses of the antibiotic or whether the gene
present in the DNA could be transferred to pathogenic microbes, rendering
them resistant to treatment with the antibiotic.
The risk of transfer of antibiotic-resistance genes from plants to
microorganisms considered to be pathogenic to humans, however, is
considered to be minimal if not insignificant. Furthermore, the potential risks
are becoming less of a concern as more developers are beginning to research
the use of alternative technologies (e.g. non-resistance-based markers) in
plant breeding. The conclusions with respect to the safe use of antibiotic-
resistance marker genes are consistent with the findings of other national
and international food safety organisations.
LABELLING
The FFDCA defines what information must be disclosed to consumers on
a food label, such as the common or usual name, and other limitations
concerning the representations or claims that can be made or suggested about
a food product. All foods must be labelled truthfully and not be misleading
to consumers. Taking this into consideration, the FFDCA does not stipulate
the disclosure of information on the basis of consumer desire to know.
Labelling may be considered misleading if it fails to reveal material facts in
light of representations that are made with respect to a product.
The labelling of foods derived from new plant varieties, including plants
developed using rDNA technology, was originally addressed in the 1992 US
FDA Statement of Policy, and most recently discussed in Draft Guidance for
Industry for the voluntary labelling of bioengineered foods. To date, the US
FDA is not aware of any information that would distinguish foods developed
using rDNA technology (e.g. bioengineered foods) as a class from foods
developed through other methods of conventional plant breeding, and as
such, has not considered the method of development a material fact requiring
disclosure on product labels. Nevertheless, after extensive consultation,
including thousands of written comments and a series of public meetings,
the US FDA has observed 'a general agreement that providing more
information to consumers about bioengineered foods would be useful'.
VOLUNTARY LABELLING
To provide guiding principles for voluntary labelling, in recognition of
the desire of certain manufacturers to label foods as produced either with or
without bioengineering, the US FDA published a 'Draft Guidance for Industry:
voluntary labelling indicating whether the foods have or have not been
developed using bioengineering'. Emphasising that the use of rDNA
technology 'is not a material fact', the US FDA recognises that some consumers
want disclosure of bioengineered content and that some manufacturers wish
to provide it. In response, Draft Guidance was issued with suggestions
concerning the use of labelling statements that are not considered misleading.
EVALUATION OF GENETICALLY
MODIFIED FOOD CROPS
Genetic modification, otherwise referred to as recombinant DNA (rDNA)
technology or gene-splicing, has proven to be a more precise, predictable
and better understood method for the manipulation of genetic material than
previously attained through conventional plant breeding. To date, agricultural
applications of the technology have involved the insertion of genes for
desirable agronomic traits (e.g. herbicide tolerance, insect resistance) into a
variety of crop plants, and from a variety of biological sources.
Examples include soybeans modified with gene sequences from a
Streptomyces species encoding enzymes that confer herbicide tolerance, and
corn plants modified to express the insecticidal protein of an indigenous soil
microorganism, Bacillus thuringiensis (Bt). A growing body of evidence
suggests that the technology may be used to make enhancements to not only
the agronomic properties, but the food, nutritional, industrial and medicinal
attributes of genetically modified (GM) crops.
Regulatory supervision of rDNA technology and its products has been in
place for a longer period of time in the United States than in most other parts
of the world. The methods and approaches established to evaluate the safety
of products developed using rDNA technology continue to evolve in response
to the increasing availability of new scientific information.
As our understanding of the potential applications of the technology is
broadened, the safety of products developed using rDNA technology and the
potential effects of introduced gene sequences on human health or the
environment will be more closely scrutinized. In fact, much of the knowledge
acquired during the commercialisation of the products of rDNA technology
in agriculture is now finding application in evaluating the safety of products
developed through more conventional means.
The objective of this chapter is to provide the reader with an overview of
the significant events leading up to the present, science-based, regulatory
framework that exists for the safety evaluation of GM food crops within the
United States. An attempt has been made to discuss concerns over the
sufficiency of existing regulations, as well as to highlight recent initiatives
taken by federal regulatory agencies to address them. Through better
communication of how the regulatory process functions within the United
States, it is anticipated that current and future applications of rDNA technology
in agriculture will be met with a greater level of understanding and acceptance.
HISTORICAL PERSPECTIVE
rDNA technology was first developed in the 1970s. The initial response
of the scientific community, including members of the National Academy of
Sciences (NAS), to the prospects of rDNA technology, was to postpone any
further research involving the technology until the potential risks to human
health and the environment could be evaluated.
Researchers attending the International Conference on Recombinant DNA
Molecules in 1975, otherwise known as the Asilomar Conference, tried to
establish a scientific consensus on how best to self-regulate emerging
applications of the technology. The conditions and restrictions that were
proposed at this conference have formed the basis by which federal guidelines
and policies for rDNA technology research were drafted within the United
States.
environment. Although all the case studies were published, OSTP/CEQ failed
to reach a consensus on issues relating to the relevant strengths and
weaknesses of the existing regulatory structure within the time allotted for
the completion of its review. A review of the case studies published provides
a comprehensive interpretation of the responsibilities of each federal
regulatory agency in ensuring the safety of the products developed using
rDNA technology.
However, the NRC report included requests for federal regulatory agencies
to further strengthen the current regulatory approval process through better
coordination and communication between agencies, on-going investment in
the research and monitoring of potential human health (e.g. allergenicity)
and environmental impacts (e.g. insect resistance), and by providing greater
access to information evaluated in support of regulatory decisions.
MODERN AGRICULTURAL
BIOTECHNOLOGY
In 1973, Cohen and Boyer transferred a gene from one organism into
another. In 1982, the first biotech plant, an antibiotic resistant tobacco, was
developed. In January 1983, at a meeting of genetic researchers in Miami,
three different teams reported success in using Agrobacterium tumefaciens,
a bacterium, to carry new genes into plant cells, heralding the dawn of modern
agricultural biotechnology. Agrobacterium tumefaciens, described as a
“natural genetic engineer”, splices its own genes into host plant cells. This
pathogenic bacterium was now converted into a pack mule, to carry new,
foreign genes into plant cells, minus the disease, and this became the most
common means of producing Genetically Engineered Organisms (GEOs).
Field tests for GE crops resistant to pests and pathogens were first
conducted in the US in 1985. A coordinated framework for the regulation
of GEOs was established and the first GE tobacco was released in 1986.
The US Department of Agriculture published guidelines for field trials of
GE crops in 1991. On approval from the Food and Drug Administration,
Flavr Savr, the first GE tomato, with a longer shelf life, was on the US
market in 1994. During 1995-96, GE soybean, corn and cotton were approved
for commercialisation in the US.
A number of GE crops, developed for pest, pathogen and herbicide
resistance, are now commercially cultivated in several countries. Rice with
pro-vitamin A, higher iron content, or human milk proteins and potatoes
with high protein content are in various stages of development. Tobacco
plants producing functional human haemoglobin and bacteria that produce
human insulin have been developed, as well as plants with vaccines against
rabies and other viral diseases. Food grain crops that withstand drought and
salinity are high priority research, and so are those for high yield. A GE
tobacco plant detoxifies soils contaminated by explosive residues, providing
a solution to a frustrating environmental problem in countries ravaged by armed
conflict. More than 70 biotech agricultural crops have now been approved
for use in North America, including varieties of soybeans, cotton, canola,
corn, potatoes, squash, tomatoes and papaya.
About six million farmers in some 17 countries now cultivate GE crops on
about 125 million acres, a 30-fold increase over 1996. By the end of the year
2002, six GE crops planted in the US (soybeans, corn, cotton, papaya, squash
and canola) produced an additional four billion pounds of food and fibre on
the same acreage, improved farm income by US $ 1.5 billion and reduced
pesticide use by 46 million pounds.
In 2003 in the US, 80 per cent of soybean acres will be planted with biotech
varieties. A number of activist groups vehemently attack GE technology and
its products on grounds of safety to humans and the environment, and costs
of technology transfer and its reach to the needy. Products of agricultural
biotechnology bear the brunt of this ill-informed, unscientific and prejudiced
onslaught much more than GE products related to health or industry.
In 2001, the European Community released results of a 15-year study,
costing US $ 64 million, and involving more than 400 research teams and 81
projects. This report concluded that GE products pose no more risk to human
health or the environment than conventional crops. So far, extensive and
intensive research on the probable risks of GE technology has not brought
out any adverse effects, and none of the fears expressed by anti-tech activists
were proved even marginally. Nevertheless, caution and case-by-case
examination of issues remain the watchwords of technologists, who are aware of their
responsibilities.
For a number of products, technology transfer is free of costs for developing
countries, as for example Golden Rice, the rice with pro-vitamin A. It is the
responsibility of the Governments of the respective countries to bear the
costs of developing local varieties and to reach the products to the needy at
an affordable cost. In India, three GE varieties of pest resistant cotton were
approved for commercialisation, a year ago. Approval for other GE varieties
of cotton and GE mustard was deferred twice by the Genetic Engineering
Approval Committee (GEAC), the highest authority on the issue. In India,
the level of public awareness of the realisable benefits and probable risks of
GE products is abysmally low.
The functioning of the GEAC leaves much to be desired. Taking advantage
of this hazy situation, mixing up economic, social and political issues with
science, and even using such vagaries of nature as the severe drought, several
groups of vested interest have created mistrust, confusion and scare. Trashing
a very promising technology this way does not augur well for the future of
Indian agriculture. This results only in denying the benefits of technology to
the farmers and consumers.
THE FUTURE OF
AGRICULTURAL BIOTECHNOLOGY
There are divergent views about the rate of uptake of new agricultural
biotechnologies and their impact. Kalter and Tauer and the US Office of
Technology Assessment anticipate comparatively rapid take-up, while others
(Buckwell and Moxey, and Farrington) do not foresee much impact until the
next century, that is until ten to fifteen years have elapsed. The caution of the
latter commentators is probably justified, for there are various hurdles to be
jumped before biotechnologies can gain acceptance. One hurdle to be
overcome is the economic one.
Self-evidently there must be some economic advantage to farmers or agro-
industry from adopting the technology. Frequently what captures the scientific
imagination turns out to be economically unviable in use, although it is only
with use that efficiency can be improved and costs brought down. A more
demanding test still for biotechnologies will be to obtain legislative approval
and public acceptance. These are intimately connected. In the case of
machinery developments, provided safety standards are observed there are
no legal impediments to developing new and improved machines, since there
are no obvious problems of hazard to the public, and therefore no problems
of public acceptance.
Chemical insecticides, fungicides and herbicides do, however, pose much
greater problems because of concerns with toxicity, and elaborate testing
and licensing procedures have evolved after a relatively haphazard set of
procedures in the 1950s and 1960s, when widespread use was made of DDT
and organophosphorus compounds without adequate recognition of their
toxicity. In significant part, because of the problems which have been
experienced with agro-chemicals, the licensing of biotechnologies will
inevitably be based on testing at least as stringent as that which now exists
for chemicals.
Indeed the licensing procedures are likely to be more stringent because of
public and scientific concerns. Public acceptance of biotechnology in the
food chain will be made more difficult by confusion about differences between
biotechnologies. In the early 1980s there was a crescendo of concern about
the use of steroids to promote faster liveweight gain in calves and cattle. The
public outcry eventually led to the banning of such hormones for meat animals;
but the legacy is a profound distrust of and hostility to new products such as
synthetic Bovine Somatotropin (BST), which has the capacity to increase the
milk yield of those dairy cows which are 'relatively' deficient in what is a
naturally occurring hormone.
Although initial scientific results are favourable, and some minor doubts
remain, the major influence leading the European Commission to impose a
moratorium on the use of BST until the end of 1990 at least is concern about
public perception; the Commission has stated 'It would be a serious setback
to producers and to the Community's milk policy were present [positive]
trends in consumption to be reversed as a result of adverse consumer reaction'.
With newly bioengineered plants and animals scientific and public concerns
are emerging which are of a different order from those associated with
previous biotechnology.
One relates to the possibility that crop failures may become more frequent
if biotechnology leads to a reduction in genetic diversity in crops being grown;
such a tendency has already been observed in relation to the Green Revolution
technology for wheat. There are also concerns about upsetting the balance of
nature in unforeseen ways, akin to the unanticipated consequence of
introducing rabbits into Australia and then trying to control these by
introducing myxomatosis. Thus questions arise such as what happens if
herbicide resistance introduced into a commercial crop plant transfers itself
by cross pollination into a closely related weed species, or indeed if the
herbicide-resistant crop should colonize wild habitats.
There are, of course, more lurid and absurd notions bandied about in the
popular press which are similar in nature to the attacks made on Darwinism.
The questions, both absurd and real, raised in relation to agricultural
biotechnology will undoubtedly slow the rate at which it is adopted.
Nevertheless instances of adoption are increasing: BST in the USA, a
bioengineered baker's yeast in the UK, bioengineered sheep producing insulin,
etc. As these become more widespread and increasingly affect lower-valued,
bulk agricultural products, so the impacts of biotechnology on structural
change will increase. The balance of economic power will switch increasingly
to industry, to high-technology large farms and against smaller farmers in
disadvantaged regions and countries. That, unfortunately, is a seemingly
inevitable consequence of what we consider to be economic progress.
Process
The ATS process generally consists of the following phases:
• Pre-treatment stage to remove large solids and other undesirable
substances from the wastewater.
• Aeration stage, where the aerobic bacteria digest the biological
wastes in the wastewater.
• Settling stage to allow any undigested solids to settle. This forms
a sludge which must be periodically removed from the system.
Process Description
Aerobic systems treat wastewater using natural processes that require
oxygen. Bacteria that thrive in oxygen-rich environments work to break down
and digest the wastewater inside the aerobic treatment unit.
Like most onsite systems, aerobic systems treat the wastewater in stages.
Sometimes the wastewater receives pretreatment before it enters the aerobic
unit, and the treated wastewater leaving the unit requires additional treatment
or disinfection before being returned to the environment. Such a variety of
designs exists for home aerobic units and systems that it is impossible to
describe a typical system. Instead, it is more practical to discuss how some
common design features of aerobic systems work and the different stages of
aerobic treatment.
Pretreatment
Some aerobic systems include a pretreatment step to reduce the amount
of solids in the wastewater going into the aerobic unit. Solids include greases,
oils, toilet paper, and other materials that are put down the drain or flushed
into the system. Too much solid material can clog the unit and prevent
effective treatment. Some pretreatment methods include a septic tank, a
primary settling compartment in the pretreatment unit, or a trash trap.
Pretreatment is optional but can greatly improve a unit’s performance.
excess solids can settle. Other designs allow the sludge to accumulate at the
bottom of the tank.
In aerobic units designed with a separate settling compartment, the sludge
returns to the aeration chamber (either by gravity or by a pumping device).
The sludge contains bacteria that also aid in the treatment process. Although,
in theory, the aerobic treatment process should eventually be able to consume
the sludge completely, in practice, the sludge does build up in most units and
will need to be pumped out periodically so that solids don’t clog the unit.
An alternative design for aerobic treatment is the attached growth system.
These units treat wastewater by taking a surface made of material that the
bacteria can attach to, and then exposing that surface alternately to wastewater
and air. This is done either by rotating the surface in and out of the wastewater
or by dosing the wastewater onto the surface. Pretreatment is required. The
air needed for the process either is naturally present or is supplied
mechanically.
Attached growth systems, such as trickling filters and rotating disks, are
less common than suspended growth systems, but have certain advantages.
For example, there is no need for mixing, and solids are less likely to be
washed out of the system during periods of heavy household water use.
Flow Design
The way in which, and the rate at which, wastewater is received by and flows
through the aerobic unit differ from design to design. Continuous flow designs simply
allow the wastewater to flow through the unit at the same rate that it leaves
the home. Other designs employ devices (such as pretreatment tanks, surge
chambers, and baffles) to control the amount of the incoming flow. Batch
process designs use pumps or siphons to control the amount of wastewater
in the aeration tank and/or to discharge the treated wastewater in controlled
amounts after a certain period.
Controlling the flow of wastewater helps to protect the treatment process.
When too much wastewater is flushed into the system all at once, it can
become overburdened, and the quality of treatment can suffer. The
disadvantages to mechanical flow control devices are that, like all mechanical
components, they need maintenance and run the risk of malfunctioning.
Disinfection
Some units have the disinfection process incorporated into the unit design.
In some cases, disinfection may be the only treatment required of the
wastewater from an aerobic unit before the water is released into the
environment. Added costs for disinfectants, such as chlorine, should be taken
into account with aerobic units.
Design Criteria
Aerobic units should be large enough to allow enough time for the solids
to settle and for the wastewater to be treated. The daily wastewater volume
is usually determined by the number of bedrooms in the house. Flows in
Arizona for individual homes are up to 140 gallons per day (gpd) per bedroom.
The needed size of an aerobic unit is often estimated the same way the
size of a septic tank is estimated, by the number of bedrooms (not bathrooms)
in the house. It is assumed that each person will use approximately 50 to
100 gallons of water per day, and that each bedroom can accommodate two
people. When calculated this way, a three-bedroom house will require a unit
with a capacity of 300 to 600 gallons per day.
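The rule of thumb above is simple arithmetic, as the short Python sketch below shows; the per-person figures are those quoted in the text.

# Sizing sketch following the rule of thumb in the text: two occupants per
# bedroom, each using roughly 50 to 100 gallons per day.

def aerobic_unit_capacity(bedrooms, gpd_low=50, gpd_high=100, people_per_bedroom=2):
    """Return the (low, high) design flow in gallons per day."""
    people = bedrooms * people_per_bedroom
    return people * gpd_low, people * gpd_high

low, high = aerobic_unit_capacity(bedrooms=3)
print(f"A three-bedroom house needs roughly {low} to {high} gallons per day.")
# Matches the 300 to 600 gpd range quoted above.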
Some health departments require that aerobic units be sized at least as
large as a septic tank in case the aerobic unit malfunctions and oxygen doesn’t
mix with the wastewater. In such cases, the aerobic unit will work as a septic
tank, which will, at least, provide partial treatment for the wastewater.
Lower temperatures tend to slow down most biological processes, and
higher temperatures tend to speed them up. The aerobic process itself creates
heat, which, along with the heat from the electrical components, may help to
keep the treatment process active. However, cold weather can have adverse
effects on the performance of aerobic units.
In one 1977 study of aerobic units, bulking of the sludge seemed to occur
when the temperature of the mixed liquor fell below 15 degrees Celsius (59
degrees Fahrenheit). Problems can sometimes be avoided by insulating
around the units.
and note the appearance of the wastewater inside the unit and its color and
odor. If the unit includes a chlorinator, this too will need to be checked and
may need cleaning. Samples may be taken of the mixed liquor from the
aeration chamber, as well as the final treated wastewater. Check to see that
all mechanical parts, alarms, and controls are in working order, and that
solids are pumped from the system if needed.
It is important that mechanical components in aerobic systems receive
regular inspection and maintenance. For example, air compressors sometimes
need to be oiled, and vanes, filters, and seals may need to be replaced.
Malfunctions are common during the first few months after installation. In
most cases, homeowners do not have the expertise to inspect, repair, and
maintain their own systems.
Most aerobic units have controls that can be switched on and off by the
homeowner in case of emergency. Aerobic units also are required to have
alarms to alert the homeowner of malfunctions. Depending on the design of
the system, controls and alarms can be located either inside or outside the
home, and alarms can be visible, audible, or both.
Homeowners should make sure that controls and alarms are always
protected from corrosion, and that the aerobic unit is turned back on if there
is a power outage or if it is turned off temporarily.
To assure homeowners that they are receiving a reputable ATU, most states require
the system to be approved by NSF International (formerly the National
Sanitation Foundation). NSF has tested aerobic units according to the
requirements of ANSI/NSF Standard 40. NSF is a nonprofit organization
devoted to the protection of the environment through the development of
product standards, product evaluations, research, education and training.
The American National Standards Institute (ANSI) is the recognized
accreditor in the U.S. for organizations that develop consumer standards
and for those that provide independent product evaluations. NSF is accredited
by ANSI for both of these areas of service.
Anaerobic Digestion
Anaerobic digestion (AD) is the harnessed and contained, naturally
occurring process of anaerobic decomposition. An anaerobic digester is an
industrial system that harnesses this natural process to treat waste, produce
biogas that can be used to power electricity generators, provide heat and
produce soil improving material. Anaerobic digesters have been around for a
long time and they are commonly used for sewage treatment or for managing
Generating Electricity
When micro-organisms consume a substrate such as sugar under aerobic
conditions they produce carbon dioxide and water; when oxygen is not
present, however, they produce carbon dioxide, protons and electrons, as
described below:
C12H22O11 + 13H2O → 12CO2 + 48H+ + 48e-
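Since the half-reaction above releases 48 electrons per molecule of sucrose, the theoretical charge available per mole follows directly from the Faraday constant, as in the short sketch below; a real fuel cell recovers only a fraction of this.

# Worked example from the half-reaction above: 48 electrons per molecule of
# sucrose. A real microbial fuel cell recovers only a fraction of this charge.

FARADAY = 96485                   # coulombs per mole of electrons
ELECTRONS_PER_SUCROSE = 48
SUCROSE_MOLAR_MASS = 342.3        # grams per mole

charge_per_mol = ELECTRONS_PER_SUCROSE * FARADAY        # C per mole of sucrose
charge_per_gram = charge_per_mol / SUCROSE_MOLAR_MASS   # C per gram of sucrose

print(f"{charge_per_mol:.3e} C per mole of sucrose")    # ~4.63e6 C/mol
print(f"{charge_per_gram:.0f} C per gram of sucrose")   # ~13,500 C/g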
Microbial fuel cells use inorganic mediators to tap into the electron
transport chain of cells and steal these electrons that are produced. The
mediator crosses the outer cell lipid membranes and plasma wall; it then
begins to liberate electrons from the electron transport chain that would
normally be taken up by oxygen or other intermediates. The now reduced
mediator exits the cell laden with electrons that it shuttles to an electrode
where it deposits them; this electrode becomes the electrogenic anode
(negatively charged electrode). The release of the electrons means that the
mediator returns to its original oxidised state ready to repeat the process. It is
important to note that this can only happen under anaerobic conditions; if
oxygen is present, it will collect all the electrons, as it has a greater
Uses
Power Generation
Microbial fuel cells have a number of potential uses. The first and most
obvious is harvesting the electricity produced for a power source. Virtually
any organic material could be used to ‘feed’ the fuel cell. MFCs could be
installed at wastewater treatment plants. The bacteria would consume waste
material from the water and produce supplementary power for the plant. The
gains to be made from doing this are that MFCs are a very clean and efficient
method of energy production. A fuel cell’s emissions are well below
regulations. MFCs also use energy much more efficiently than standard
combustion engines, which are limited by the Carnot cycle. In theory an MFC
is capable of energy efficiency far beyond 50% (17).
However, MFCs do not have to be used on a large scale; it has even been
suggested that MFCs could be implanted in the body to be employed as a
power source for a pacemaker, a microsensor or a microactuator. The MFC
would take glucose from the blood stream or possibly other substrates
contained in the body and use this to generate electricity to power these
devices. The advantage of using an MFC in this situation, as opposed to a
normal battery, is that it uses a renewable form of energy and would not need
to be recharged like a standard battery. Further, they can be built very small
and they operate well in mild conditions, 20°C to 40°C and at a pH of
around 7 (19).
Further Uses
The electricity from the fuel cells can be harnessed in applications
such as EcoBots, Gastrobots and biosensors. Since the current generated from a
microbial fuel cell is directly proportional to the strength of the wastewater used
as the fuel, an MFC can be used to measure the strength of wastewater. The
strength of wastewater is commonly evaluated as biochemical oxygen demand
(BOD) values. BOD values are determined by incubating samples for 5 days
with a proper source of microbes, usually activated sludge collected from sewage
works. When BOD values are used as a real-time control parameter, 5 days'
incubation is too long.
An MFC-type BOD sensor can be used to measure real-time BOD values.
Oxygen and nitrate are preferred electron acceptors over the electrode,
reducing current generation from an MFC, so MFC-type BOD sensors
underestimate BOD values in the presence of these electron acceptors. This
can be avoided by inhibiting aerobic and nitrate respiration in the MFC with
terminal oxidase inhibitors such as cyanide and azide, which improves the
performance of a microbial fuel cell as a BOD sensor. This type of BOD
sensor is commercially available.
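Because the current is directly proportional to the strength of the wastewater, converting a measured current into an approximate BOD value is a matter of applying a calibration, as sketched below in Python. The calibration slope used here is purely hypothetical; in practice it would be fitted against conventional 5-day BOD tests.

# Sketch of the proportionality described in the text; the calibration slope
# is hypothetical and would be fitted against standard 5-day BOD tests.

CALIBRATION_MG_L_PER_MA = 40.0    # hypothetical: BOD (mg/L) per mA of current

def estimate_bod(current_ma, slope=CALIBRATION_MG_L_PER_MA):
    """Convert MFC current into an approximate real-time BOD reading."""
    return slope * current_ma

reading = estimate_bod(current_ma=3.2)
print(f"~{reading:.0f} mg/L BOD (valid only if oxygen and nitrate respiration are inhibited)")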
the type of waste and the area, a level of processing may follow collection.
This processing may be to reduce the hazard of the waste, recover material
for recycling, produce energy from the waste, or reduce it in volume for
more efficient disposal.
Collection methods vary widely between different countries and regions,
and it would be impossible to describe them all. For example, in Australia
most urban domestic households have a 240 litre (63.4 gallon) bin that is
emptied weekly by the local council. Many areas, especially those in less
developed areas, do not have a formal waste-collection system in place.
In Canadian urban centres curbside collection is the most common method
of disposal, whereby the city collects waste and/or recyclables and/or organics
on a scheduled basis from residential areas. In rural areas people dispose of
their waste at transfer stations. Waste collected is then transported to a regional
landfill.
Disposal methods also vary widely. In Australia, the most common method
of disposal of solid waste is to landfills, because it is a large country with a
low-density population. By contrast, in Japan it is more common for waste
to be incinerated, because the country is smaller and land is scarce.
Landfill
Disposing of waste in a landfill is the most traditional method of waste
disposal, and it remains a common practice in most countries. Historically,
landfills were often established in disused quarries, mining voids or borrow
pits. Running a landfill that minimises environmental problems can be a
hygienic and relatively inexpensive method of disposing of waste materials;
however, a more efficient method of disposal will almost surely be needed in
time as less land becomes available for such purposes.
Older or poorly managed landfills can create a number of adverse
environmental impacts, including wind-blown litter, attraction of vermin and
pollutants such as leachate, which can leach into and pollute groundwater
and rivers. Another product of landfills containing harmful wastes is landfill
gas, mostly composed of methane and carbon dioxide, which is produced as
the waste breaks down anaerobically.
Characteristics of a modern landfill include methods to contain leachate,
such as clay or plastic liners. Disposed waste should be compacted
and covered to avoid attracting mice and rats and to prevent wind-blown
litter. Many landfills also have a landfill gas extraction system installed after
closure to extract the gas generated by the decomposing waste materials.
This gas is often burnt in a gas engine to generate electricity. Even flaring the
gas off is a better environmental outcome than allowing it to escape to the
atmosphere, as this consumes the methane, which is a far stronger greenhouse
gas than carbon dioxide. Some of it can be tapped for use as a fuel.
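A rough calculation shows why capturing the gas is worthwhile. The sketch below assumes a lower heating value for methane of about 50 MJ/kg and a gas-engine electrical efficiency of roughly 35 per cent; both figures are approximations rather than site data.

# Order-of-magnitude sketch; the heating value and engine efficiency are
# approximate assumptions, not measurements from any particular site.

METHANE_LHV_MJ_PER_KG = 50.0      # approximate lower heating value of methane
ENGINE_EFFICIENCY = 0.35          # rough gas-engine electrical efficiency

def electricity_kwh(methane_kg):
    """Electrical energy recoverable from a mass of captured landfill methane."""
    thermal_mj = methane_kg * METHANE_LHV_MJ_PER_KG
    return thermal_mj * ENGINE_EFFICIENCY / 3.6   # 3.6 MJ per kWh

print(f"{electricity_kwh(1000):.0f} kWh from one tonne of captured methane")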
Many local authorities, especially in urban areas, have found it difficult to
establish new landfills due to opposition from owners of adjacent land. Few
people want a landfill in their local neighborhood. As a result, solid waste
disposal in these areas has become more expensive as material must be
transported further away for disposal.
Some oppose the use of landfills in any way, anywhere, arguing that the logical end result of landfill operations is a drastically polluted planet with no canyons and no wild space. Some futurists
have stated that landfills will be the “mines of the future”: as some resources
become more scarce, they will become valuable enough that it would be
necessary to ‘mine’ them from landfills where these materials were previously
discarded as valueless.
This fact, as well as growing concern about the impacts of excessive
materials consumption, has given rise to efforts to minimise the amount of
waste sent to landfill in many areas. These efforts include taxing or levying
waste sent to landfill, recycling the materials, converting material to energy,
designing products that require less material, and legislation mandating that
manufacturers are responsible for final packaging and materials disposal costs
(as in the manufacturers setting up and funding the “Grüne Punkt” in Germany
to achieve that end). A related subject is industrial ecology, in which the material flows between industries are studied. The by-products of one industry may be a useful commodity to another, leading to a reduced materials wastestream.
Incineration
Incineration is the process of destroying waste material by burning it.
Incineration is often alternatively named “Energy-from-waste” (EfW) or
“waste-to-energy”; this is misleading as there are other ways of recovering
energy from waste that do not involve directly burning it.
Incineration is carried out both on a small scale by individuals, and on a
large scale by industry. It is recognised as a practical method of disposing of
hazardous waste materials, such as biological medical waste. Many entities
now refer to disposal of wastes by exposure to high temperatures as thermal
treatment (however, this also includes gasification and pyrolysis).
The plasma reactor does not discriminate between types of waste. It can
process any type of waste. The only variable is the amount of energy that it
takes to destroy the waste. Consequently, no sorting of waste is necessary
and any type of waste, other than nuclear waste, can be processed.
The reactors are large and operate at a slightly negative pressure, meaning that the feed system is simplified because the gas does not tend to escape. The gas has to be pulled from the reactor by the suction of the compressor. Each reactor can process 20 tonnes per hour (t/h), compared with 3 t/h for typical gasifiers. Because of the size and negative pressure, the feed system can
handle bundles of material up to 1 metre in size. This means that whole
drums or bags of waste can be fed directly into the reactor making the system
ideal for large scale production.
The gas coming out of a plasma gasifier is lower in trace contaminants
than with any kind of incinerator or other gasifier. Because the process starts
with lower emissions out of the reactor, it is able to achieve significantly
lower stack emissions. The gasifier is insensitive to the amount of moisture in the waste. The moisture consumes energy to vaporise and can affect the capacity and economics; however, it will not affect the process itself.
Gas from the plasma reactor can be burned to produce electricity or can
be synthesised into ethanol to contribute to automotive fuel.
TREATMENT IN THE
RECEIVING ENVIRONMENT
Many processes in a wastewater treatment plant are designed to mimic
the natural treatment processes that occur in the environment, whether that
environment is a natural water body or the ground. If not overloaded, bacteria
in the environment will consume organic contaminants, although this will
reduce the levels of oxygen in the water and may significantly change the
overall ecology of the receiving water. Native bacterial populations feed on
the organic contaminants, and the numbers of disease-causing microorganisms
are reduced by natural environmental conditions such as predation, exposure
to ultraviolet radiation, etc.
Consequently, in cases where the receiving environment provides a high
level of dilution, a high degree of wastewater treatment may not be required.
However, recent evidence has demonstrated that very low levels of certain
contaminants in wastewater, including hormones (from animal husbandry
and residue from human birth control pills) and synthetic materials such as
countries, such as diarrhea, typhus and cholera, are caused primarily by poor
hygiene practices and poor disposal of wastewater. The public health impact
of the discharge of untreated wastewater is comparatively much lower.
Hygiene promotion, on-site sanitation and low-cost sanitation thus are likely
to have a much greater impact on public health than wastewater treatment.
Given the scarcity of financial resources in developing countries and the
poor track record of wastewater treatment plants, it could thus be argued that
investments should first be undertaken to evacuate wastewater from human
settlements and to promote good hygiene practices.
Only once this has been achieved should substantial funds be invested in wastewater treatment plants. However, national legislation modeled on standards in the US and the EU has often led to the prioritization of expensive wastewater treatment without having much of an environmental impact. In 2000, the United Nations established that 2.64 billion people had inadequate access to sanitation, adequate sanitation being defined as access to an improved latrine, a septic tank or a sewer.
This value represented 44 percent of the global population, but in Africa
and Asia approximately half of the population had no access whatsoever to
sanitation. There are few reliable figures on the share of the wastewater
collected in sewers that is being treated in developing countries. However,
in Latin America about 15% of collected wastewater passes through treatment
plants (with varying levels of actual treatment) and in Sub-Saharan Africa
almost none of the collected wastewater is treated.
Biological Reactions
Aerobic bacteria need nitrogen and phosphate, as well as carbon and
oxygen, to live. Both elements are widespread in nature and known to be
important ingredients in any kind of manure. The optimum ratio of the
elements carbon, nitrogen and phosphorus in the nutrition of bacteria has
been determined as 100:5:1.
Compost
Biological activity similar to that in waste water treatment can be observed in the production of compost. Organic
waste is eaten by micro-organisms, producing heat that stimulates them to
eat more, reproduce and die. Death of the biomass is the final stage in the
process and is what makes compost a good fertiliser.
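As a rough numerical illustration of the 100:5:1 carbon:nitrogen:phosphorus ratio quoted above, the short Python sketch below estimates the nitrogen and phosphorus needed to balance a given carbon load; the function name and the example carbon load are hypothetical and chosen only for illustration.

def nutrient_demand(carbon_kg, ratio=(100, 5, 1)):
    # Return (nitrogen_kg, phosphorus_kg) needed to balance carbon_kg of
    # biodegradable carbon, assuming the C:N:P ratio of 100:5:1 quoted above.
    c, n, p = ratio
    return carbon_kg * n / c, carbon_kg * p / c

# Hypothetical example: 200 kg of biodegradable carbon per day.
nitrogen_kg, phosphorus_kg = nutrient_demand(200.0)
print(f"Nitrogen needed:   {nitrogen_kg:.1f} kg/day")    # 10.0 kg/day
print(f"Phosphorus needed: {phosphorus_kg:.1f} kg/day")  # 2.0 kg/day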
WASTEWATER TREATMENT
Domestic wastewater treatment is the process of removing contaminants
from sewage. It includes physical, chemical and biological processes to
remove physical, chemical and biological contaminants. Its objective is to
produce a wastestream (or treated effluent) and a solid waste or sludge also
suitable for discharge or reuse back into the environment. This material is
often inadvertently contaminated with toxic organic and inorganic
surface water sewers in the UK. Overflows from foul sewers designed to relieve pressure from heavy rainfall are termed storm sewers or combined sewer overflows.
As rainfall runs over the surface of roofs and the ground, it may pick up various contaminants including soil particles (sediment), heavy metals, organic compounds, animal waste, and oil and grease. Some jurisdictions require stormwater to receive some level of treatment before being discharged directly into waterways. Examples of treatment processes used for stormwater include sedimentation basins, wetlands, and vortex separators (to remove coarse solids).
The site where the process is conducted is called a sewage treatment plant.
The flow scheme of a sewage treatment plant is generally the same for all
countries:
• Mechanical treatment: influx (influent), removal of large objects, removal of sand and grit, pre-precipitation.
• Biological treatment: oxidation bed (oxidizing bed) or aerated system, post-precipitation, effluent.
• Chemical treatment (this step is usually combined with settling and other processes to remove solids, such as filtration; the combination is referred to in the US as physical-chemical treatment).
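To make the flow scheme above concrete, the Python sketch below treats the plant as an ordered pipeline of stages and applies assumed removal fractions to a hypothetical influent BOD value. The stage names follow the scheme above, but the removal percentages and the influent value are illustrative assumptions only, not design figures.

# Sketch of the treatment train as an ordered pipeline of stages.
# Removal fractions and influent BOD are illustrative assumptions.
stages = [
    ("Mechanical treatment (screening, grit removal)", 0.05),
    ("Primary sedimentation",                          0.30),
    ("Biological (secondary) treatment",               0.85),
    ("Chemical / tertiary treatment",                  0.50),
]

def run_treatment_train(influent_bod_mg_l, train):
    # Apply each stage's assumed BOD removal fraction in sequence.
    bod = influent_bod_mg_l
    for name, removal in train:
        bod *= (1.0 - removal)
        print(f"{name:<48s} -> BOD {bod:6.1f} mg/L")
    return bod

run_treatment_train(300.0, stages)  # hypothetical influent BOD of 300 mg/L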
Treatment Stages
Primary Treatment
Primary treatment reduces oils, grease, fats, sand, grit, and coarse (settleable) solids. This step is done entirely with machinery, hence the name mechanical treatment.
Influx (Influent) and Removal of Large Objects
In the mechanical treatment, the influx (influent) of sewage water is strained to remove all large objects that are deposited in the sewer system, such as rags, sticks, condoms, sanitary towels (sanitary napkins) or tampons, cans, fruit, etc. This is most commonly done with a manual or automated mechanically raked screen. This type of waste is removed because it can damage the sensitive equipment in the sewage treatment plant.
Sedimentation
Many plants have a sedimentation stage where the sewage is allowed to
pass slowly through large tanks, commonly called “primary clarifiers” or
“primary sedimentation tanks”. The tanks are large enough that faecal solids
can settle and floating material such as grease and plastics can rise to the
surface and be skimmed off. The main purpose of the primary stage is to
produce a generally homogeneous liquid capable of being treated biologically
and a sludge that can be separately treated or processed. Primary settlement
tanks are usually equipped with mechanically driven scrapers that continually
drive the collected sludge towards a hopper in the base of the tank from
where it can be pumped to further sludge treatment stages.
Secondary Treatment
Secondary treatment is designed to substantially degrade the biological content of the sewage, such as that derived from human waste, food waste, soaps and detergent. The majority of municipal and industrial plants treat the
settled sewage liquor using aerobic biological processes. For this to be
effective, the biota require both oxygen and a substrate on which to live.
There are a number of ways in which this is done. In all these methods, the
bacteria and protozoa consume biodegradable soluble organic contaminants
(e.g. sugars, fats, organic short-chain carbon molecules, etc.) and bind much
of the less soluble fractions into floc particles. Secondary treatment systems
are classified as fixed film or suspended growth. In fixed film systems - such as rock filters - the biomass grows on media and the sewage passes over its surface.
Roughing Filters
Roughing filters are intended to treat particularly strong or variable organic
loads, typically industrial, to allow them to then be treated by conventional
secondary treatment processes. They are typically tall, circular filters filled
with open synthetic filter media to which sewage is applied at a relatively
high rate. The design of the filters allows high hydraulic loading and a high
flow-through of air. On larger installations, air is forced through the media
using blowers. The resultant liquor is usually within the normal range for
conventional treatment processes.
Activated Sludge
Activated sludge plants use a variety of mechanisms and processes to use
dissolved oxygen to promote the growth of biological floc that substantially
removes organic material. It also traps particulate material and can, under ideal conditions, convert ammonia to nitrite and nitrate, and ultimately to nitrogen gas (see also denitrification).
Filter Beds (Oxidising Beds)
In older plants and plants receiving more variable loads, trickling filter beds are used, where the settled sewage liquor is spread onto the surface of a deep bed made up of coke (carbonised coal), limestone chips or specially fabricated plastic media.
Such media must have high surface areas to support the biofilms that form.
The liquor is distributed through perforated rotating arms radiating from a
central pivot.
The distributed liquor trickles through this bed and is collected in drains
at the base. These drains also provide a source of air which percolates up
through the bed, keeping it aerobic. Biological films of bacteria, protozoa
and fungi form on the media’s surfaces and eat or otherwise reduce the
organic content. This biofilm is grazed by insect larvae and worms which
help maintain an optimal thickness. Overloading of beds increases the
thickness of the film leading to clogging of the filter media and ponding on
the surface.
Secondary Sedimentation
The final step in the secondary treatment stage is to settle out the biological
floc or filter material and produce sewage water containing very low levels
of organic material and suspended matter.
Tertiary Treatment
Tertiary treatment provides a final stage to raise the effluent quality to the
standard required before it is discharged to the receiving environment (sea,
river, lake, ground, etc.) More than one tertiary treatment process may be
used at any treatment plant. If disinfection is practiced, it is always the final
process. It is also called Effluent polishing.
Filtration
Sand filtration removes much of the residual suspended matter. Filtration over activated carbon removes residual toxins.
Lagooning
Lagooning provides settlement and further biological improvement through storage in large man-made ponds or lagoons. These lagoons are highly aerobic and colonization
by native macrophytes, especially reeds, is often encouraged. Small filter
feeding invertebrates such as Daphnia and species of Rotifera greatly assist
in treatment by removing fine particulates.
Other heavy metals. These contaminants come from mining waste and
tailings, landfills, or hazardous waste dumps.
Chlorinated solvents. Effluents from metal and plastic working, fabric cleaning, and electronic and aircraft manufacturing are often discharged and contaminate groundwater.
DISEASES
Water-borne diseases are infectious diseases spread primarily through
contaminated water. Though these diseases are spread either directly or through flies or filth, water is the chief medium for their spread and hence they are termed water-borne diseases. Most intestinal (enteric)
diseases are infectious and are transmitted through faecal waste. Pathogens
– which include virus, bacteria, protozoa, and parasitic worms – are disease-
producing agents found in the faeces of infected persons. These diseases are
more prevalent in areas with poor sanitary conditions.
These pathogens travel through water sources and are transferred directly by persons handling food and water. Since these diseases are highly
infectious, extreme care and hygiene should be maintained by people looking
after an infected patient. Hepatitis, cholera, dysentery, and typhoid are the
more common water-borne diseases that affect large populations in the tropical
regions. A large number of chemicals that either exist naturally in the land or
are added due to human activity dissolve in the water, thereby contaminating
it and leading to various diseases.
Pesticides. The organophosphates and carbamates present in pesticides affect and damage the nervous system and can cause cancer. Some pesticides contain carcinogens that exceed recommended levels. They contain chlorinated compounds that cause reproductive and endocrine damage.
Lead. Lead is hazardous to health as it accumulates in the body and affects
the central nervous system. Children and pregnant women are most at risk.
Fluoride. Excess fluorides can cause yellowing of the teeth and damage
to the spinal cord and other crippling diseases.
Nitrates. Drinking water that gets contaminated with nitrates can prove
fatal especially to infants that drink formula milk as it restricts the amount of
oxygen that reaches the brain causing the ‘blue baby’ syndrome. It is also
linked to digestive tract cancers. It causes algae to bloom resulting in
eutrophication in surface water.
Petrochemicals. Benzene and other petrochemicals can cause cancer even
at low exposure levels.
PREVENTIVE MEASURES
Water-borne epidemics and health hazards in the aquatic environment are
mainly due to improper management of water resources. Proper management
of water resources has become the need of the hour as this would ultimately
lead to a cleaner and healthier environment. In order to prevent the spread of
water-borne infectious diseases, people should take adequate precautions. The
city water supply should be properly checked and necessary steps taken to
disinfect it. Water pipes should be regularly checked for leaks and cracks. At
home, the water should be boiled, filtered, or otherwise treated, and necessary steps taken to ensure that it is free from infection.
ladder. They were first asked how much they were willing to pay to maintain
national water quality in the boatable level. Subsequent questions asked them
their willingness to pay for overall water quality to fishable quality and
swimmable quality. The average willingness-to-pay amounts given by the respondents for the two higher levels consist of the amounts they offered for the lower levels plus any additional amount they offered for the higher level.
The average annual amounts per household for those respondents who
answered the willingness-to-pay questions turned out to be:
The most substantial benefit is for boatable water. The respondents are
willing to give about 20 percent more for fishable water than boatable water,
but only an additional 15 percent to make the water swimmable. The data
also permitted one to make a rough distinction between the type of recreation
and the intrinsic values discussed earlier. Since the willingness-to-pay
questions measure the overall value that respondents have for water quality,
the amount given by each respondent represents the combination of
recreational and intrinsic values held by that person. But it was possible to
tell from the questions whether a person actually engaged in water-based
recreation.
It was reasoned that the values expressed by the respondents who do not
engage in in-stream recreation should be almost purely intrinsic in nature. In
calculating the average willingness-to-pay amount for the nonrecreationists
alone, therefore, we get an approximation of the intrinsic value of water
quality. By subtracting this amount from the total the recreationists are willing
to pay, one can estimate, in a rough way, the portions of the recreationists’
benefits which are attributable to recreation and intrinsic values.
When this is done, it is found that intrinsic value constitutes about 45
percent of the total value for recreationists, 100 percent for the
nonrecreationists (by assumption), and about 55 percent for the sample as a
whole. If this is a correct reflection of reality, it is a major finding and may
have large implications for the future study of benefits from environmental
improvement. It was noted earlier that, while the sample of persons
interviewed was initially chosen at random, quite a few respondents failed to
give usable answers. Any aggregate national benefit estimate based on these
data therefore could not be put forward as accurate. Thus, I make such an
estimate simply to illustrate that the results of this experiment imply very
large values.
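Since the survey's actual dollar amounts are not reproduced here, the following Python sketch simply restates the arithmetic of the split described above with hypothetical per-household figures, chosen so that the resulting shares come out close to the percentages quoted in the text.

# Sketch of the recreational/intrinsic split described above.
# All dollar amounts and the recreationist share are hypothetical.
def split_wtp(wtp_recreationists, wtp_nonrecreationists, share_recreationists):
    # Nonrecreationists' WTP is taken as purely intrinsic (by assumption);
    # subtracting it from recreationists' WTP approximates their
    # recreational component.
    recreational = wtp_recreationists - wtp_nonrecreationists
    sample_avg = (share_recreationists * wtp_recreationists
                  + (1 - share_recreationists) * wtp_nonrecreationists)
    intrinsic_share_of_sample = wtp_nonrecreationists / sample_avg
    return recreational, sample_avg, intrinsic_share_of_sample

rec, avg, share = split_wtp(wtp_recreationists=200.0,
                            wtp_nonrecreationists=90.0,
                            share_recreationists=0.67)
print(f"Recreational portion for recreationists: ${rec:.0f} per household")
print(f"Intrinsic value as a share of the sample average: {share:.0%}")
print(f"Illustrative national aggregate (80 million households): "
      f"${80e6 * avg / 1e9:.1f} billion per year")

With these hypothetical figures the intrinsic share works out to 45 per cent of the total for recreationists and about 55 per cent for the sample as a whole, mirroring the proportions reported above; the aggregate figure is purely illustrative.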
There are about 80 million households in the United States. Assume that
the sample results imply that to have high-quality recreational waters
PESTICIDE MONITORING
IN SURFACE WATER
Monitoring data for pesticides are generally poor in much of the world
and especially in developing countries. Key pesticides are included in the
monitoring schedule of most western countries; however, the cost of analysis
and the necessity to sample at critical times of the year (linked to periods of
pesticide use) often preclude development of an extensive data set. Many
developing countries have difficulty carrying out organic chemical analysis
due to problems of inadequate facilities, impure reagents and financial
constraints. New techniques using immunoassay procedures for presence/
absence of specific pesticides may reduce costs and increase reliability.
Immunoassay tests are available for triazines, acid amides, carbamates, 2,4-D/phenoxy acid, paraquat and aldrin.
Data on pesticide residues in fish for lipophilic compounds and
determination of exposure and/or impact of fish to lipophobic pesticides
The effects of pollution on soil are quite alarming and can cause huge
disturbances in the ecological balance and health of living creatures on earth.
Some of the most serious soil pollution effects are mentioned below:
• Decrease in soil fertility and therefore a decrease in soil yield. After all, how can one expect a contaminated soil to produce healthy crops?
• Loss of soil and natural nutrients present in it. Plants also would not
thrive in such a soil, which would further result in soil erosion.
• Disturbance in the balance of flora and fauna residing in the soil.
• Increase in salinity of the soil, which therefore makes it unfit for
vegetation, thus making it useless and barren.
• Generally, crops cannot grow and flourish in polluted soil. If some crops do manage to grow, they may be toxic enough to cause serious health problems in people consuming them.
• Creation of toxic dust is another potential effect of soil pollution.
• Foul smell due to industrial chemicals and gases might result in
headaches, fatigue, nausea, etc. in many people.
• Soil pollutants would bring in alteration in the soil structure, which
would lead to death of many essential organisms in it. This would
also affect the larger predators and compel them to move to other
places, once they lose their food supply.
PESTICIDES
The term “pesticide” is a composite term that includes all chemicals that
are used to kill or control pests. In agriculture, this includes herbicides (weeds),
insecticides (insects), fungicides (fungi), nematocides (nematodes), and
rodenticides (vertebrate poisons). A fundamental contributor to the Green
Revolution has been the development and application of pesticides for the
control of a wide variety of insect and herbaceous (weed) pests that would
otherwise diminish the quantity and quality of food produce. The use of
pesticides coincides with the “chemical age” which has transformed society
since the 1950s.
In areas where intensive monoculture is practised, pesticides are used as a standard method for pest control. Unfortunately, with the benefits of
chemistry have also come disbenefits, some so serious that they now threaten
the long-term survival of major ecosystems by disruption of predator-prey
relationships and loss of biodiversity. Also, pesticides can have significant
human health consequences. While agricultural use of chemicals is restricted
to a limited number of compounds, agriculture is one of the few activities
where chemicals are intentionally released into the environment because they
kill things.
Agricultural use of pesticides is a subset of the larger spectrum of industrial
chemicals used in modern society. The American Chemical Society database
indicates that there were some 13 million chemicals identified in 1993 with
some 500 000 new compounds being added annually. In the Great Lakes of
North America, for example, the International Joint Commission has estimated
that there are more than 200 chemicals of concern in water and sediments of
the Great Lakes ecosystem. Because the environmental burden of toxic
chemicals includes both agriculture and non-agricultural compounds, it is
difficult to separate the ecological and human health effects of pesticides
from those of industrial compounds that are intentionally or accidentally
released into the environment.
However, there is overwhelming evidence that agricultural use of pesticides
has a major impact on water quality and leads to serious environmental
consequences. Although the number of pesticides in use is very large, the
largest usage tends to be associated with a small number of pesticide products.
In a recent survey in the agricultural western provinces of Canada where
some fifty pesticides are in common use, 95% of the total pesticide application
is from nine separate herbicides. Although pesticide use is low to nil in
traditional and subsistence farming in Africa and Asia, environmental, public
effect that does not cause death over the test period but which causes
observable effects in the test organism such as cancers and tumours,
reproductive failure, growth inhibition, teratogenic effects, etc.).
• Persistence: Measured as half-life (time required for the ambient
concentration to decrease by 50%). Persistence is determined by biotic
and abiotic degradational processes. Biotic processes are
biodegradation and metabolism; abiotic processes are mainly
hydrolysis, photolysis, and oxidation. Modern pesticides tend to have
short half lives that reflect the period over which the pest needs to be
controlled.
• Degradates: The degradational process may lead to formation of
“degradates” which may have greater, equal or lesser toxicity than
the parent compound. As an example, DDT degrades to DDD and
DDE.
• Fate (Environmental): The environmental fate (behaviour) of a pesticide
is affected by the natural affinity of the chemical for one of four
environmental compartments: solid matter (mineral matter and
particulate organic carbon), liquid (solubility in surface and soil water),
gaseous form (volatilization), and biota. This behaviour is often referred
to as “partitioning” and involves, respectively, the determination of:
the soil sorption coefficient (KOC); solubility; Henry’s Constant (H);
and the n-octanol/water partition coefficient (KOW). These parameters
are well known for pesticides and are used to predict the environmental
fate of the pesticide.
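The half-life and partitioning parameters listed above lend themselves to simple calculations. The Python sketch below shows, first, how a half-life translates into the fraction of a pesticide remaining after a given time and, second, how the partition parameters might be used to flag the compartment a compound favours. The example record and the threshold values are illustrative assumptions, not regulatory criteria.

# Sketch: using half-life and partitioning parameters descriptively.
# The example pesticide and the cut-off values are illustrative assumptions.
def residual_fraction(days_elapsed, half_life_days):
    # Fraction of the initial concentration remaining, assuming
    # simple first-order decay.
    return 0.5 ** (days_elapsed / half_life_days)

def likely_compartments(koc, solubility_mg_l, log_kow):
    # Very rough flags for the compartments a pesticide may favour.
    flags = []
    if koc > 1000:            # strongly sorbed to soil/sediment (assumed cut-off)
        flags.append("solid matter (soil and sediment)")
    if solubility_mg_l > 30:  # readily dissolved (assumed cut-off)
        flags.append("surface and soil water")
    if log_kow > 4:           # lipophilic, prone to bioconcentration (assumed cut-off)
        flags.append("biota (fatty tissue)")
    return flags or ["no strong affinity flagged"]

# Hypothetical pesticide: half-life 30 days, KOC 5000, solubility 5 mg/L, log KOW 5.
print(f"Remaining after 90 days: {residual_fraction(90, 30):.1%}")
print("Likely compartments:", ", ".join(likely_compartments(5000, 5, 5)))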
An additional factor can be the presence of impurities in the pesticide formulation that are not part of the active ingredient. A recent example is
the case of TFM, a lampricide used in tributaries of the Great Lakes for
many years for the control of the sea lamprey. Although the environmental
fate of TFM has been well known for many years, recent research by
Munkittrick et al. has found that TFM formulation includes one or more
highly potent impurities that impact on the hormonal system of fish and
cause liver disease.
Bioconcentration
This is the movement of a chemical from the surrounding medium into
an organism. The primary “sink” for some pesticides is fatty tissue (“lipids”).
Some pesticides, such as DDT, are “lipophilic”, meaning that they are
soluble in, and accumulate in, fatty tissue such as edible fish tissue and
human fatty tissue. Other pesticides such as glyphosate are metabolized
and excreted.
Biomagnification
This term describes the increasing concentration of a chemical as food energy is transformed within the food chain. As smaller organisms are eaten by larger organisms, the concentrations of pesticides and other chemicals are increasingly magnified in tissue and other organs (a simple numerical sketch of this magnification follows the list of effects below). Very high concentrations can be observed
in top predators, including man. The ecological effects of pesticides (and other
organic contaminants) are varied and are often inter-related. Effects at the
organism or ecological level are usually considered to be an early warning
indicator of potential human health impacts. The major types of effects are
listed below and will vary depending on the organism under investigation and
the type of pesticide. Different pesticides have markedly different effects on
aquatic life which makes generalization very difficult. The important point is
that many of these effects are chronic (not lethal), are often not noticed by
casual observers, yet have consequences for the entire food chain.
• Death of the organism.
• Cancers, tumours and lesions on fish and animals.
• Reproductive inhibition or failure.
• Suppression of immune system.
• Disruption of endocrine (hormonal) system.
• Cellular and DNA damage.
• Teratogenic effects (physical deformities such as hooked beaks on
birds).
• Poor fish health marked by low red to white blood cell ratio, excessive
slime on fish scales and gills, etc.
• Intergenerational effects (effects are not apparent until subsequent
generations of the organism).
• Other physiological effects such as egg shell thinning.
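The numerical sketch promised above illustrates biomagnification with an assumed starting concentration and an assumed ten-fold magnification per trophic level; both numbers, and the food chain itself, are hypothetical.

# Sketch: biomagnification of a persistent pesticide up a food chain.
# Starting concentration and per-level magnification factor are assumptions.
def biomagnify(base_conc_ug_kg, magnification_per_level, levels):
    # Return the concentration at each trophic level, starting from level 1.
    concentrations = [base_conc_ug_kg]
    for _ in range(levels - 1):
        concentrations.append(concentrations[-1] * magnification_per_level)
    return concentrations

chain = ["plankton", "small fish", "large fish", "fish-eating bird"]
for organism, conc in zip(chain, biomagnify(0.1, 10.0, len(chain))):
    print(f"{organism:<16s} {conc:8.1f} ug/kg")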
These effects are not necessarily caused solely by exposure to pesticides
or other organic contaminants, but may be associated with a combination of
environmental stresses such as eutrophication and pathogens. These
associated stresses need not be large to have a synergistic effect with organic
micro pollutants. Ecological effects of pesticides extend beyond individual
organisms and can extend to ecosystems. Swedish work suggests that the application of pesticides is one of the most significant factors affecting biodiversity. Jonsson et al. report that the continued decline of the
Swedish partridge population is linked to changes in land use and the use of
chemical weed control. Chemical weed control has the effect of reducing
habitat, decreasing the number of weed species, and of shifting the balance
of species in the plant community. Swedish studies also show the impact of
pesticides on soil fertility, including inhibition of nitrification with
concomitant reduced uptake of nitrogen by plants. These studies also suggest
that pesticides adversely affect soil micro-organisms which are responsible
for microbial degradation of plant matter (and of some pesticides), and for
soil structure.
Process of Metabolism
Metabolism of pesticides in animals is an important mechanism by which
organisms protect themselves from the toxic effects of xenobiotics (foreign
chemicals) in their food supply. In the organism, the chemical is transformed
into a less toxic form and either excreted or stored in the organism. Different
organs, especially the liver, may be involved, depending on the chemical.
Enzymes play an important role in the metabolic process and the presence of
REGIONAL EXAMPLES OF
ECOLOGICAL EFFECTS
In Europe, the European Environment Agency cites a study by Galas et al. that closely links the toxicity of Po River water to the zooplankton Daphnia magna with run-off of agricultural pesticides. In the Great Lakes of North
America bioaccumulation and magnification of chlorinated compounds in
what is, on global standards, a relatively clean aquatic system, caused the
disappearance of top predators such as eagle and mink and deformities in
several species of aquatic birds.
The World Wide Fund for Nature reports that a significant amount of an
estimated 190 000 tons of agricultural pesticides plus additional loadings of
non-agricultural pesticides that are released by riparian countries bordering
the North Sea, eventually are transported into the North Sea by a combination
of riverine, groundwater, and atmospheric processes. WWF further reports
that the increased rate of disease, deformities and tumours in commercial
fish species in highly polluted areas of the North Sea and coastal waters of
the United Kingdom since the 1970s is consistent with effects known to be
caused by exposure to pesticides.
ND (not detectable) values, therefore, are not evidence that the chemical is not present at concentrations that may be injurious to aquatic life and to human health. That this analytical problem existed in the United States suggests that the problem of producing water quality data that can be used for human health protection from pesticides in developing countries must be extremely serious. Additionally, detection limits are only one of many
analytical problems faced by environmental chemists when analysing for
organic contaminants.
Even when one has good analytical values from surface water and/or
sediments, the interpretation of pesticide data is not straight forward. For
example, the persistence of organochlorine pesticides is such that the
detection of, say, DDT may well indicate only that:
• The chemical has been deposited through long range transport from
some other part of the world,
• It is a residual from the days when it was applied in that region.
In North America, for example, DDT is still routinely measured even
though it has not been used for almost two decades. The association of
organochlorine pesticides with sediment means that the ability of a river
basin to cleanse itself of these chemicals is partly a function of the length of
time it requires for fine-grained sediment to be transported through the basin.
Geomorphologists now know that the process of erosion and transport of
silts and clays is greatly complicated by sedimentation within the river system
and that this fine-grained material may take decades to be transported out of
the river basin. For sediment-associated and persistent pesticides that are
still in use in some countries, the presence of the compound in water and/or
sediments results from a combination of current and past use. As such, the
data make it difficult to determine the efficacy of policy decisions such as
restrictive use or bans.
Pesticide monitoring requires highly flexible field and laboratory
programmes that can respond to periods of pesticide application, which can
sample the most appropriate medium (water, sediment, biota), are able to
apply detection levels that have meaning for human health and ecosystem
protection, and which can discriminate between those pesticides which appear
as artifacts of historical use versus those that are in current use. For pesticides
that are highly soluble in water, monitoring must be closely linked to periods
of pesticide use.
In the United States where there have been major studies of the behaviour
of pesticide run-off, the triazines (atrazine and cyanazine) and alachlor
(chlorinated acetamide) are amongst the most widely used herbicides. These
are used mainly in the spring. Studies by Schottler et al. indicate that 55-
80% of the pesticide run-off occurred in the month of June.
The significance for monitoring is that many newer and soluble pesticides
can only be detected shortly after application; therefore, monitoring
programmes that are operated on a monthly or quarterly basis are unlikely to
be able to quantify the presence or determine the significance of pesticides
in surface waters.
Pesticides that have limited application are even less likely to be detected
in surface waters. The danger lies in the presumption by authorities that ND (non-detectable) values imply that pesticides are absent. It may well only
mean that monitoring programmes failed to collect data at the appropriate
times or analysed the wrong media.
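A simple way to see why infrequent sampling misses pesticide pulses is to model run-off as a short concentration pulse and count how many scheduled samples fall inside it. In the Python sketch below, the pulse window (roughly the month of June, following the Schottler et al. observation quoted above) and the sampling schedules are illustrative assumptions.

# Sketch: does a fixed sampling schedule catch a short run-off pulse?
# Pulse window and schedules are illustrative assumptions.
def samples_hitting_pulse(sample_days, pulse_start_day, pulse_end_day):
    # Return the scheduled sampling days that fall within the run-off pulse.
    return [d for d in sample_days if pulse_start_day <= d <= pulse_end_day]

pulse = (152, 181)  # hypothetical pulse: days 152-181, roughly June
schedules = {
    "weekly":    list(range(1, 366, 7)),
    "monthly":   list(range(15, 366, 30)),
    "quarterly": list(range(45, 366, 91)),
}
for name, days in schedules.items():
    hits = samples_hitting_pulse(days, *pulse)
    print(f"{name:<9s} sampling: {len(hits)} sample(s) within the pulse")

With these assumptions, weekly sampling intersects the pulse several times, monthly sampling only once, and quarterly sampling not at all, which is the point made above about monitoring frequency.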
Pesticide Registration
Pesticide control is mainly carried out by a system of national
registration which limits the manufacture and/or sale of pesticide products
to those that have been approved. In developed countries, registration is
a formal process whereby pesticides are examined, in particular, for
mammalian toxicity (cancers, teratogenic and mutagenic effects, etc.) and
for a range of potential environmental effects, based on the measured or estimated environmental behaviour of the product as inferred from its physico-chemical properties. Most developing countries have limited capability
to carry out their own tests on pesticides and tend to adopt regulatory
criteria from the developed world. As our knowledge of the effects of
pesticides in the environment accumulates, it has become apparent that
many of the older pesticides have inadequate registration criteria and are
being re-evaluated.
As a consequence, the environmental effects of many of the older pesticides
are now recognized as so serious that they are banned from production or
sale in many countries. A dilemma in many developing countries is that many
older pesticides (e.g. DDT) are cheap and effective. Moreover, regulations
are often not enforced with the result that many pesticides that are, in fact,
banned, are openly sold and used in agricultural practice. The gap between actual pesticide use and official policy on pesticide use is, in many countries, very wide. Regulatory control in many countries is ineffective without
a variety of other measures, such as education, incentives, etc.
The extent to which these are effective in developed versus developing
countries depends very much on:
• The ability of government to effectively regulate and levy taxes and
• On the ability or readiness of the farming community to understand
and act upon educational programmes.
DANISH EXAMPLE
In 1986 the Danish Government initiated an Action Plan for sustainable agriculture which would reduce the use of pesticides, with two purposes:
• Safeguard Human Health: From the risks and adverse effects
associated with the use of pesticides, primarily by preventing intake
via food and drinking water.
• Protect the Environment: Both the non-target and beneficial organisms
found in the flora and fauna on cultivated land and in aquatic
environments.
The objective was to achieve a 50% reduction in the use of agricultural
pesticides by 1997 from the average amount of pesticides used during the
period 1981-85.
This was to be measured by:
• A decline in total sales (by weight) of the active ingredients and,
• Decrease in frequency of application.
While the World Wide Fund for Nature reports that by 1993 sales of active ingredients had been reduced by 30%, the application frequency had not declined. The Danish legislation included several components although, by 1993, not all had achieved comparable success. Beginning with the use of
DDT during the Second World War (whose discovery earned entomologist
Dr. Paul Mueller a Nobel Prize in 1948), the pesticide industry has grown
Pesticides are also found far afield, in ecosystems considered pristine and
far from active pesticide use. Osprey eggs in the Queen Charlotte Islands,
polar bear fat in the high arctic, and the blubber of whales in all the oceans of
the world are contaminated with pesticide residues, even though all these
creatures live far from point sources of pesticide application. Water and wind,
as well as the bodies of animals that serve as prey for others (including
humans) higher on the food chain, are the universal vectors for pesticide
dispersal. Highest on the food chain, human breast milk is of great concern
because of high levels of bio-accumulated pesticides. Breast milk of Inuit
women contains much higher pesticide levels than the milk of women in
southern Canada, raising concerns about this most intimate and crucial form
of human sustenance.
Two other factors make pesticides problematic for human and ecosystem
health. First, many pesticides are not persistent in human or other biological
systems. Therefore they may be difficult to measure in tissue or other samples
collected more than a few hours after exposure, although their biological
effects may persist for days, months or even years. Second, many pesticides
undoubtedly have additive or synergistic effects with one another, especially
when they belong to the same chemical class. Only recently have these two
issues been acknowledged by legislators, with the 1996 US Food Quality Protection Act being the first major enactment anywhere in the world
that takes the latter fact into consideration.
A further health issue regarding pesticides has emerged in the last decade.
This is the demonstration that many chemical compounds, among them many
pesticides, have hormone-like effects in biological systems, effects that were
previously unsuspected as occurring on such a wide scale. During the last
four years, the US Environmental Protection Agency (EPA) has been
designing a first-ever programme for analysing these effects.
The work has proceeded so slowly, however, despite a legislative mandate
to act speedily, that the EPA itself is now being sued for dragging its feet!
The Canadian government, lacking legislation for examining any adverse
effects of pesticides, relies on manufacturers to supply such evidence.
STATEMENT
Reaching the goal of pesticide elimination cannot be accomplished without a
dramatically increased support programme for farmers and other growers who are
prepared to convert to sustainable growing practices, including cessation of pesticide
use.
We believe that the best means to accomplish the goal of eliminating routine
pesticide use is as follows:
• Through an immediate and substantive increase in funding and
practical support for research and information dissemination
concerning alternative, nontoxic methods of pest control, coupled with
strong market incentives for non-chemical lawn and garden care
contractors and product suppliers.
• Through the development of new and imaginative legislative
initiatives and clear-cut and substantive market incentives (including
tax shifting) to support and encourage the rapid expansion of organic
growing practices in all parts of the country and at all levels of government. This must include an essentially cost-free, uniform,
nationwide certification process for new and already established
organic growing operations.
• Through the Federal government and its regulators immediately moving
towards a legislated end to cosmetic pesticide use within two years, as
recommended by the House of Commons Standing Committee on
Environment and Sustainable Development. (Cosmetic uses encompass
lawn and decorative garden management, and the noncommercial growing
of food crops.)
• Through the Federal government legislating, for the Pesticide
Management Regulatory Agency, an increasingly restrictive regulatory
framework governing the use of synthetic pesticides. This would begin
with the most toxic substances, but ultimately include all synthetic
chemical pesticides and ‘inerts’ unless needed for critical, short-term,
emergency situations.
Three initial steps in this direction must include:
– The immediate elimination of the most toxic pesticides, as
determined by an independent scientific panel;
– The rapid introduction of full disclosure of ALL ingredients in
pesticide formulations; and
– The establishment of an independent office for the collection
and public disclosure of all reports of proven or possible adverse
effects resulting from pesticide exposures.
• Through all government pesticide regulation reflecting the
following four essential elements:
– The precautionary principle (do not act without reasonable proof of harmlessness);
– The principle of reverse onus (the producer bears responsibility
for safety);
Radioactive substances are those which have the ability to emit high-energy radiation, such as alpha and beta particles and gamma rays. They are unstable in
nature and are continuously emitting these particles in order to gain some
stability. When we are talking about the effects of radioactive pollution, it
actually means the effects of these emissions on the environment and living
beings of the earth.
plants’ DNA and affect its normal functioning. Some of the plants may die
after such exposure while others may develop weak seeds. When any part of
the contaminated plant, including the fruits are consumed by human beings,
it causes serious health risks. Radioactive emissions from nuclear weapons are considered the most harmful for the environment, as they can stay in the atmosphere for as long as a hundred years and thus affect several generations. Similarly, the radioactive substances that flow from the land surface down to water bodies remain there for years to come and cause
harm to the aquatic animals. Thus, we can say that radioactive pollution has
a destructive effect on the entire ecosystem.
RADIOACTIVE POLLUTION
Radioactive pollution can be defined as the release of radioactive
substances or high-energy particles into the air, water, or earth as a result of
human activity, either by accident or by design.
The sources of such waste include:
• Nuclear weapon testing or detonation;
• The nuclear fuel cycle, including the mining, separation, and
production of nuclear materials for use in nuclear power plants or
nuclear bombs;
• Accidental release of radioactive material from nuclear power plants.
Sometimes natural sources of radioactivity, such as radon gas emitted
from beneath the ground, are considered pollutants when they become
a threat to human health.
Since even a small amount of radiation exposure can have serious (and
cumulative) biological consequences, and since many radioactive wastes
remain toxic for centuries, radioactive pollution is a serious environmental
concern even though natural sources of radioactivity far exceed artificial
ones at present. The problem of radioactive pollution is compounded by the
difficulty in assessing its effects.
Radioactive waste may spread over a broad area quite rapidly and
irregularly from an abandoned dump into an aquifer and may not fully show
its effects upon humans and organisms for decades in the form of cancer or
other chronic diseases. Surface waters are a powerful factor that causes
migration of radionuclides across the territory of Belarus. For this reason it
is essential to take into due account the transit role of rivers in the
transportation of radionuclides, including transboundary transfer. In watercourses and flowing water bodies, concentrations of radionuclides are decreasing every year, but they tend to accumulate in static water bodies (lakes, ponds, reservoirs, and especially in bottom sediments).
Due to the accident at the Chernobyl nuclear power station (CNPS) one
quarter of the country’s territory was contaminated by caesium-137 (23%),
strontium-90 (10%) and plutonium (about 2%). The monitoring data show that the radiation situation in the Dnieper-Sozh and Pripyat basins is stable. The annual average concentration of caesium-137 has decreased significantly in large and small rivers over the period observed since the early 1990s. Exceedances of the national permissible levels for caesium-137 and strontium-90 were not observed. Caesium-137 concentrations in surface waters are closely connected with the annual volume of river flow, as can be seen from the increase in concentrations of caesium-137 in some rivers where water supply was lower than the perennial average values.
The data analysis of concentration of caesium-137 during the spring flood
in the Pripyat basin shows that concentration of caesium-137 in a dissolved
form in the Pripyat basin remains at the level of the average indices for the
prior period. Yet concentration of this radionuclide has considerably increased
in dredges. This means that caesium-137 is washed out and transported by
flood flow with sediments. Since the radiation situation has stabilized,
transboundary transport of radioactive elements through river flow has
significantly decreased. Mainly it is the Pripyat that transports radionuclides,
in particular strontium-90, as they are washed out from the 30-kilometre
Chernobyl zone. Due attention is devoted to studies of the radiation state of
small rivers, which are tributaries of the Pripyat and the Sozh in the most
contaminated areas of the Gomel and Mogilev regions. Over the years
radioactivity in water tends to decrease.
The exception is dissolved strontium-90, which is a specific feature of the CNPS zone. Annual data on the content of caesium-137 and strontium-
90 (in soluble and suspended forms) in the bottom sediments and water biota
suggest that bottom sediments and water biota are significant contributors to
the total radioactivity of surface water systems. A tendency towards a reduction
in radioactivity of bottom sediments and water biota is minor. During spring
high water and summer-autumn floods, migration of radionuclides into open water systems occurs both in soluble form and in forms absorbed on organic and mineral carriers. The ratio between concentrations of caesium-137 and strontium-90 at the end checkpoints of the Braginka and Senna Rivers suggests that, starting from 1992-1993, the concentration of strontium-90 has begun to exceed that of caesium-137. That phenomenon is characteristic of surface
watercourses close to the CNPS zone and is explained by an increase in
NUCLEAR FACILITIES
The Nuclear Facilities Unit of the RPMWS develops and implements the
DEQ’s Nuclear Facilities Emergency Response Procedures and the nuclear
accident aspects of the Michigan Emergency Management Plan as they relate
to the DEQ’s responsibilities to respond to accidents or emergencies at any
of Michigan’s commercial nuclear power plants or to large-scale radiological
incidents. These efforts are conducted in cooperation with other state agencies
and under the overall emergency response coordination of the Michigan State
Police. Unit staff also interact with nuclear plant utility staff and staff of the
NRC concerning the day-to-day operations of nuclear power reactors to assure
radiological protection of the public and the environment.
RISK EVALUATION
The joint projects between Norway and Russia include risk analysis of potential accident scenarios at several nuclear plants in Russia. Substantial amounts of radioactive waste were released from nuclear plants and directed out into rivers and the ocean, causing severe pollution of the environment. Today, the storage of radioactive waste at plants with poor security is a big problem.
There is great concern about the consequences if accidents take place, what
the effects might be on the local environment, and even for other countries.
Calculations and risk evaluations have been made in case of potential
Types of Radiation
Radiation is classified as being ionizing or nonionizing. Both types can
be harmful to humans and other organisms. The aspects of radioactive pollution considered here include nonionizing radiation, ionizing radiation, lifestyle and radiation dose, nuclear weapons testing, nuclear power plants, and the biological effects of radioactivity.
NONIONIZING RADIATION
Nonionizing radiation is relatively long-wavelength electro-magnetic
radiation, such as radio waves, microwaves, visible radiation, ultraviolet
radiation, and very low-energy electromagnetic fields. Nonionizing radiation
is generally considered less dangerous than ionizing radiation. However, some
Ionizing Radiation
Ionizing radiation is the short wavelength radiation or particulate radiation
emitted by certain unstable isotopes during radioactive decay. There are about
70 radioactive isotopes, all of which emit some form of ionizing radiation as
they decay from one isotope to another. A radioactive isotope typically decays
through a series of other isotopes until it reaches a stable one. As indicated
by its name, ionizing radiation can ionize the atoms or molecules with which
it interacts.
In other words, ionizing radiation can cause other atoms to release their
electrons. These free electrons can damage many biochemicals, such as
proteins, lipids, and nucleic acids (including DNA). If intense, this damage can cause severe human health problems, including cancers, and even death.
Ionizing radiation can be either short-wavelength electromagnetic radiation
or particulate radiation.
Gamma radiation and X-radiation are short-wavelength electromagnetic
radiation. Alpha particles, beta particles, neutrons, and protons are particulate
radiation. Alpha particles, beta particles, and gamma rays are the most
commonly encountered forms of radioactive pollution. Alpha particles are
simply ionized helium nuclei, and consist of two protons and two neutrons.
Beta particles are electrons, which have a negative charge. Gamma radiation
is high-energy electromagnetic radiation. Scientists have devised various units
for measuring radioactivity.
A Curie (Ci) represents the rate of radioactive decay. One Curie is 3.7 × 10¹⁰ radioactive disintegrations per second. A rad is a unit representing the absorbed dose of radioactivity. One rad is equal to an absorbed energy dose of 100 ergs per gram of irradiated medium. One rad = 0.01 Gray. A rem is a unit that measures the effectiveness of radioactivity in causing biological damage. One rem is equal to one rad times a biological weighting factor. The weighting factor is 1.0 for gamma radiation and beta particles, and it is 20 for alpha particles. One rem = 1000 millirem = 0.01 Sievert. The radioactive half-life is a measure of the persistence of radioactive material.
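The unit relationships just quoted can be collected into a small Python sketch; the source activity and absorbed dose used in the example are hypothetical.

# Sketch: radioactivity and dose unit conversions quoted above
# (1 Ci = 3.7e10 disintegrations/s, 1 rad = 0.01 Gray, 1 rem = 0.01 Sievert,
#  rem = rad x biological weighting factor).
CI_TO_BQ = 3.7e10
RAD_TO_GRAY = 0.01
REM_TO_SIEVERT = 0.01
WEIGHTING_FACTOR = {"gamma": 1.0, "beta": 1.0, "alpha": 20.0}

def dose_equivalent_rem(absorbed_dose_rad, radiation_type):
    # Dose equivalent in rem = absorbed dose in rad x weighting factor.
    return absorbed_dose_rad * WEIGHTING_FACTOR[radiation_type]

activity_ci = 0.5   # hypothetical source activity
print(f"{activity_ci} Ci = {activity_ci * CI_TO_BQ:.2e} disintegrations/s")

absorbed_rad = 2.0  # hypothetical absorbed dose
print(f"{absorbed_rad} rad = {absorbed_rad * RAD_TO_GRAY} Gray")
for rad_type in ("gamma", "alpha"):
    rem = dose_equivalent_rem(absorbed_rad, rad_type)
    print(f"{absorbed_rad} rad of {rad_type}: {rem} rem "
          f"({rem * REM_TO_SIEVERT:.2f} Sv)")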
problem. This waste will remain radioactive for many thousands of years, so
technologists must design systems for extremely long-term storage. One
obvious problem is that the long-term reliability of the storage systems cannot
be fully assured, because they cannot be directly tested for the length of time
they will be used (i.e., for thousands of years). Another problem with nuclear
waste is that it will remain extremely dangerous for much longer than the
expected lifetimes of existing governments and social institutions. Thus, we
are making the societies of the following millennia, however they may be
structured, responsible for the safe storage of nuclear waste that is being
generated in such large quantities today.
RADIOACTIVITY
Atomic nuclei that are not stable tend to approach a stable configuration through the process of radioactivity. Atoms are radioactive because the ratio of neutrons to protons is not ideal. Through radioactive decay, the nucleus
approaches a more stable neutron to proton ratio. Radioactive decay releases
different types of energetic emissions. The three most common types of
radioactive emissions are alpha particles, beta particles, and gamma rays.
Fission is also a form of radioactive decay. Alpha (α) decay occurs when the
neutron to proton ratio is too low. Alpha decay emits an alpha particle, which
consists of two protons and two neutrons. This is the same as a helium nucleus
and is often written with the same chemical symbol, ⁴₂He. Alpha particles are highly ionizing (i.e., they deposit their energy over a short distance).
Since alpha particles lose energy over a short distance, they cannot travel
far in most media. For example, the range of a 5 MeV alpha particle in air is
only 3.5 cm. Consequently, alpha particles will not normally penetrate the
outermost layer of the skin. Therefore, alpha particles pose little external
radiation field hazard. Shielding of alpha particles is easily accomplished with minimal amounts of material. Examples of alpha-particle-emitting radionuclides include ²³⁸U, ²³⁹Pu, and ²⁴¹Am.
²³⁸₉₂U → ²³⁴₉₀Th + ⁴₂He
²³⁹₉₄Pu → ²³⁵₉₂U + ⁴₂He
²⁴¹₉₅Am → ²³⁷₉₃Np + ⁴₂He
After the emission of an α particle, the mass number of the remaining daughter product is reduced by 4 and its atomic number by 2, as can be verified in the examples above.
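The rule just stated (mass number reduced by 4, atomic number by 2) can be checked mechanically; the small Python sketch below reproduces the three examples, with a symbol table limited to the nuclides used here.

# Sketch: alpha-decay rule described above (daughter has A - 4, Z - 2).
SYMBOLS = {90: "Th", 92: "U", 93: "Np", 94: "Pu", 95: "Am"}

def alpha_daughter(mass_number, atomic_number):
    # Return (A, Z) of the daughter nuclide after one alpha emission.
    return mass_number - 4, atomic_number - 2

for a, z in [(238, 92), (239, 94), (241, 95)]:
    da, dz = alpha_daughter(a, z)
    print(f"{a}{SYMBOLS[z]} -> {da}{SYMBOLS[dz]} + 4He")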
Beta (β-) decay occurs when the neutron to proton ratio is too high. The
radioactive nucleus emits a beta particle, which is essentially an electron, in order to bring the ratio to a more favourable value. Beta particles are less ionizing
than alpha particles. The range of beta particles depends on the energy, and
some have enough to be of concern regarding external exposure. A 1 MeV
beta particle can travel approximately 12 feet in air. Energetic beta particles
can penetrate into the body and deposit dose to internal structures near the
surface. Since beta particles are less ionizing than alpha particles, greater
shielding is required. Low-Z materials are selected as beta particle shields to limit the X-ray emissions associated with the slowing down of beta particles as they travel through a medium. In β⁻ emission, the neutron to proton ratio is reduced by converting a neutron into a proton:
¹₀n → ¹₁p + e⁻
Emission of γ-rays changes neither the mass number nor the atomic number. If a nucleus is in an excited state, it reaches a more stable state by emitting γ radiation.
dps. The other unit prevalent for activity is the Curie (Ci); 1 Ci = 3.7 × 10¹⁰ dps. These units do not distinguish between alpha, beta or gamma emissions. These units provide an understanding of the “strength” of the radioactive sample
but do not account for any of the properties of the radiation emitted. To
describe the degree of hazard to people from a particular radiation requires
other units.
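Converting between these units is simple arithmetic. The Python sketch below is illustrative only (the function names are arbitrary); it uses the conversion 1 Ci = 3.7 × 10¹⁰ dps stated above.

    # Sketch: converting activity between curies and disintegrations per second.

    DPS_PER_CURIE = 3.7e10   # 1 Ci = 3.7 x 10^10 dps (= 3.7 x 10^10 Bq)

    def curies_to_dps(activity_ci):
        return activity_ci * DPS_PER_CURIE

    def dps_to_curies(activity_dps):
        return activity_dps / DPS_PER_CURIE

    print(curies_to_dps(1.0))       # 3.7e10 dps
    print(curies_to_dps(1.0e-6))    # one microcurie = 37,000 dps
    print(dps_to_curies(3.7e10))    # 1.0 Ci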
Energy Levels: As mentioned earlier, quantum mechanics is needed to
understand and quantify atomic and subatomic features. The Quantum Theory
recognizes restrictions on the levels of energy that a system can acquire. Electron orbits and nucleon shells are filled following such restrictions. The energy of an electron depends on its orbit, and only certain orbits are permitted by nature.
Similarly, the nucleons within a nucleus occupy only those energy states that are permitted. An electron or a nucleon therefore cannot go to an arbitrary energy level: once the system is specified, the allowed energy states are specified. These are known as discrete energy levels. A transition from one such level to the next lower level releases energy that is exactly the difference between the two levels, and cannot be a fraction of it. The energy is said to be quantised. The electron orbits permitted for an atom are characteristic of that (species of) atom, and when an electron jumps from one orbit to a lower one, an X-ray, called a characteristic X-ray, emerges with energy equal to the difference between the energies associated with the two orbits.
This even helps to identify the atom from which the X-rays were emitted. A similar principle applies to nuclei. Although there are no orbits inside a nucleus for the nucleons to move around in, they do have their own energy states. A given gamma-emitting radioactive nucleus will eject gamma rays (gamma quanta) characteristic of that nucleus. The energies taken by the nucleons within an unexcited nucleus are called bound levels. Similarly, a nucleus can be raised (excited) in its internal energy only to certain permitted levels, called excited levels.
These levels vary from nucleus to nucleus but are fixed for a given nucleus. When a nucleus is unexcited, it is said to be in its ground state. The separation between adjacent levels decreases as the energy increases. When a nucleus is excited to a particular level, it de-excites towards the ground state by emitting a neutron, a gamma quantum, or another particle; fission is also such a process. The de-excitation can occur in a single step, or in multiple steps involving a series of particle emissions, gamma emissions, or both.
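Because each emitted gamma quantum carries exactly the difference between two discrete levels, the photon energies in a stepwise de-excitation cascade must add up to the initial excitation energy. The Python sketch below illustrates this bookkeeping with made-up level energies (the values and function name are hypothetical, not data from this text).

    # Sketch: photon energies in a level-by-level de-excitation cascade
    # sum to the initial excitation energy of the nucleus.

    # Hypothetical discrete levels of a nucleus, in MeV, including the ground state (0.0).
    levels = [0.0, 0.8, 1.9, 3.2]

    def cascade_photon_energies(levels):
        """Energies emitted if the nucleus steps down one level at a time."""
        descending = sorted(levels, reverse=True)
        return [round(descending[i] - descending[i + 1], 3)
                for i in range(len(descending) - 1)]

    photons = cascade_photon_energies(levels)
    print(photons)                  # [1.3, 1.1, 0.8] MeV, one photon per step
    print(round(sum(photons), 3))   # 3.2 MeV, equal to the initial excitation energy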
Fig. Synthesis of mechanisms from three ecological fields (A-C) to identify natural processes
that promote biodiversity, ecosystem functioning, and ecosystem stability (D): Mechanisms in
bold on the left can be combined and described as stabilizing species interactions because all
of these mechanisms result from negative frequency-dependent natural processes.
The natural processes that are predicted to locally promote biodiversity, ecosystem stability, and ecosystem functioning have commonly been considered separately, but they are quite congruent. Theoretical and empirical studies have identified mechanisms that can promote all three properties. Interestingly, stabilizing species interactions, which cause a species to limit itself more than it limits other species, are predicted to promote biodiversity, ecosystem stability, and ecosystem functioning, and previous studies have confirmed that they can do so.
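A standard way to formalise "a species limits itself more than it limits other species" is the two-species Lotka-Volterra competition model; this model and the coefficient values in the Python sketch below are a textbook illustration, not material taken from this text. In that model, interactions are stabilizing when the product of the interspecific competition coefficients is smaller than the product of the intraspecific ones.

    # Sketch: the stabilizing condition in a two-species Lotka-Volterra competition model.
    # a12 is the per-capita effect of species 2 on species 1 (and vice versa for a21);
    # a11 and a22 are the intraspecific effects. Interactions are stabilizing when
    # a12 * a21 < a11 * a22, i.e. species limit themselves more than they limit each other.

    def interactions_are_stabilizing(a11, a12, a21, a22):
        return a12 * a21 < a11 * a22

    print(interactions_are_stabilizing(1.0, 0.5, 0.6, 1.0))   # True: weak interspecific competition
    print(interactions_are_stabilizing(1.0, 1.2, 1.1, 1.0))   # False: species limit each other more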
Stabilizing species interactions occur when interspecific interactions (i.e.,
between individuals from different species) are more favourable than
intraspecific interactions (i.e., between individuals of the same species). This
results in a rare species advantage, common species disadvantage, or both.
Species interactions are stabilizing, for instance, when interspecific resource competition is weaker than intraspecific resource competition.
Fig. A hypothesis tree that can be used to tease apart the relative
importance of various types of stabilizing species interactions
Climate scientists use radiative forcing (the change in the difference between the amount of radiation entering and exiting Earth’s atmosphere) to determine the causes and consequences of climate change (IPCC 2007). Radiative forcing is central to this discussion because
it is influenced by both natural and anthropogenic processes and it influences
many climate variables. Future ecological studies could take a similar
approach to determine the causes and consequences of changes in biodiversity.
Stabilizing species interactions are central to this discussion because they
can be influenced by both natural and anthropogenic processes, and they can
influence both biodiversity and ecosystem properties.
BIODIVERSITY LOSS
Loss of biodiversity or biodiversity loss is the ongoing extinction of species
worldwide, and also the local reduction or loss of species in a certain habitat
or ecological niche or biome. The latter phenomenon can be temporary or
permanent, depending on whether the environmental degradation that leads
to the loss is reversible through ecological restoration or ecological resilience
or effectively permanent (e.g. through land loss). Global extinction has so
far been proven to be irreversible.
Even though permanent global species loss is a more dramatic phenomenon than regional changes in species composition, even minor departures from a healthy stable state can have a dramatic influence on the food web and the food chain: reductions in a single species can adversely affect the entire chain (coextinction) and lead to an overall reduction in biodiversity, possible alternative stable states of an ecosystem notwithstanding. The ecological benefits of biodiversity are correspondingly undermined by its loss. Reduced biodiversity in particular leads to reduced ecosystem services and ultimately poses a direct danger to food security and to humankind.
LOSS RATE
The current rate of global diversity loss is estimated to be 1,000 times higher than the naturally occurring background extinction rate, and it is expected to rise further in the coming years. Locally bounded loss rates can be measured using species richness and its variation over time. Raw counts may not be as ecologically relevant as relative or absolute abundances. Taking relative frequencies into account, a considerable number of biodiversity indices have been developed. Besides richness, evenness and heterogeneity are considered to be the main dimensions along which diversity can be measured. As with all diversity measures, it is essential to accurately specify the spatial and temporal scale of observation.
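To illustrate how such indices weigh relative frequencies, the Python sketch below computes richness, the Shannon diversity index, and Pielou's evenness from a hypothetical set of abundance counts; the formulas are standard ones and the counts are invented, neither being taken from this text.

    # Sketch: richness, Shannon diversity and evenness from species abundance counts.
    import math

    def richness(counts):
        """Number of species with at least one recorded individual."""
        return sum(1 for c in counts if c > 0)

    def shannon_index(counts):
        """Shannon diversity H' = -sum(p_i * ln p_i) over relative abundances p_i."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

    def pielou_evenness(counts):
        """Evenness J = H' / ln(richness); J = 1 means all species equally abundant."""
        s = richness(counts)
        return shannon_index(counts) / math.log(s) if s > 1 else 0.0

    abundances = [50, 30, 15, 4, 1]               # hypothetical survey counts
    print(richness(abundances))                    # 5 species
    print(round(shannon_index(abundances), 3))     # about 1.17
    print(round(pielou_evenness(abundances), 3))   # about 0.73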
FACTORS
Major factors for biotic stress and the ensuing accelerating loss rate are,
amongst other threats:
1. Habitat loss and degradation
Land use intensification (and the ensuing land loss/habitat loss) has been identified as a significant factor in the loss of ecological services, through direct effects as well as through biodiversity loss.
2. Climate change through heat stress and drought stress
3. Excessive nutrient load and other forms of pollution
4. Over-exploitation and unsustainable use (e.g. unsustainable fishing methods); we are currently using 25% more natural resources than the planet can sustain.
5. Invasive alien species that effectively compete for a niche, replacing
indigenous species.
INSECT LOSS
In 2017, various publications described the dramatic reduction in absolute insect biomass and number of species in Germany and North America over a period of 27 years. As possible reasons for the decline, the authors highlight neonicotinoids and other agrochemicals. Writing in the journal PLOS ONE, Hallmann, Sorg, et al. (2017) conclude that “the widespread insect biomass decline is alarming.”
Other pressures include acid rain, climate alterations, and the introduction of exotic species, all coincident with human population growth. For rainforests, the primary factor
is land conversion. Climate will probably change least in tropical regions,
and nitrogen problems are not as important because growth in rainforests is
usually limited more by low phosphorus levels than by nitrogen insufficiency.
The introduction of exotic species is also less of a problem than in temperate
areas because there is so much diversity in tropical forests that newcomers
have difficulty becoming established.
HUMAN POPULATION GROWTH
The geometric rise in human population levels during the twentieth century
is the fundamental cause of the loss of biodiversity. It exacerbates every
other factor having an impact on rainforests (not to mention other ecosystems).
It has led to an unceasing search for more arable land for food production
and livestock grazing, and for wood for fuel, construction, and energy.
Previously undisturbed areas (which may or may not be suitable for the
purposes to which they are put) are being transformed into agricultural
or pasture land, stripped of wood, or mined for resources to support the energy
needs of an ever-growing human population. Humans also tend to settle in
areas of high biodiversity, which often have relatively rich soils and other
attractions for human activities. This leads to great threats to biodiversity,
especially since many of these areas have numerous endemic species.
Balmford et al. (2001) have demonstrated that human population size in a
given tropical area correlates with the number of endangered species, and
that this pattern holds for every taxonomic group. Most of the other effects
mentioned below are either consequent to the human population expansion
or related to it.
The human population was approximately 600 million in 1700, and one billion in 1800. It now exceeds six billion, and estimates suggest it may reach 10 billion by the mid-21st century and 12 billion by 2100.
The question is whether many ecological aspects of biological systems can
be sustained under the pressure of such numbers. Can birds continue to
migrate, can larger organisms have space (habitat) to forage, can ecosystems
survive in anything like their present form, or are they doomed to
impoverishment and degradation?
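The growth figures quoted above imply a roughly exponential trajectory. The Python sketch below is an illustration only (the simple growth model, the year 2000 endpoint, and the rounded figures are assumptions, not data from this text); it estimates the average annual growth rate implied by a rise from about 600 million in 1700 to about six billion around 2000.

    # Sketch: average annual growth rate implied by exponential population growth,
    # N(t) = N0 * exp(r * t), using rounded figures quoted in the text.
    import math

    def implied_growth_rate(n0, n1, years):
        """Average continuous growth rate r such that n0 * exp(r * years) = n1."""
        return math.log(n1 / n0) / years

    r = implied_growth_rate(600e6, 6e9, 300)   # ~600 million (1700) to ~6 billion (~2000)
    print(round(100 * r, 2), "% per year")     # roughly 0.77 % per year on average

    # Doubling time at that average rate:
    print(round(math.log(2) / r, 1), "years")  # roughly 90 years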
HABITAT DESTRUCTION
Habitat destruction is the single most important cause of the loss of
rainforest biodiversity and is directly related to human population growth.
POLLUTION
Industrial, agricultural and waste-based pollutants can have catastrophic
effects on many species. Those species which are more tolerant of pollution
will survive; those requiring pristine environments (water, air, food) will
not. Thus, pollution can act as a selective agent. Pollution of water in lakes
and rivers has degraded waters so that many freshwater ecosystems are dying.
Since almost 12% of animal species live in these ecosystems, and most
others depend on them to some degree, this is a very serious matter. In
developing countries approximately 90% of wastewater is discharged,
untreated, directly into waterways.
AGRICULTURE
The dramatic increase in the number of humans during the twentieth
century has instigated a concomitant growth in agriculture, and has led to
conversion of wildlands to croplands, massive diversions of water from lakes,
rivers and underground aquifers, and, at the same time, has polluted water
and land resources with pesticides, fertilizers, and animal wastes. The result
has been the destruction, disturbance or disabling of terrestrial ecosystems,
and polluted, oxygen-depleted and atrophied water resources. Formerly,
agriculture in different regions of the world was relatively independent and
local. Now, however, much of it has become part of the global exchange
economy and has caused significant changes in social organization.
Earlier agricultural systems were integrated with and co-evolved with
technologies, beliefs, myths and traditions as part of an integrated social
system. Generally, people planted a variety of crops in different areas, in the
hope of obtaining a reasonably stable food supply. These systems could only
be maintained at low population levels, and were relatively non-destructive
(but not always). More recently, agriculture has in many places lost its local
character, and has become incorporated into the global economy. This has
led to increased pressure on agricultural land for exchange commodities and
export goods. More land is being diverted from local food production to
“cash crops” for export and exchange; fewer types of crops are raised, and
each crop is raised in much greater quantities than before. Thus, ever more
land is converted from forest (and other natural systems) for agriculture for
export, rather than using land for subsistence crops.
The introduction of monocropping and the use of relatively few plants for food and other purposes, at the expense of the wide variety of plants and animals utilized by earlier and indigenous peoples, is responsible for a loss of diversity and genetic variability. The native plants and animals adapted to local conditions are being replaced with “foreign” (or “exotic”) species which require special inputs of food and nutrients and large quantities of water. Such exotic species frequently drive out native species. There is pressure to conform in crop selection and agricultural techniques, all driven by global markets and technologies.
GLOBAL WARMING
There is recent evidence that climate changes are having effects on tropical
forest ecology. Warming in general (as distinct from the effects of increasing
concentrations of CO2 and other greenhouse gases) can increase primary
productivity, yielding new plant biomass, increased organic litter, and
increased food supplies for animals and soil flora (decomposers). Temperature
changes can also alter the water cycle and the availability of nitrogen and
other nutrients. Basically, the temperature variations which are now occurring
affect all parts of forest ecosystems, some more than others. These interactions
are unimaginably complex. While warming may at first increase net primary
productivity (NPP), in the longer run, because plant biomass is increasing,
more nitrogen is taken up from the soil and sequestered in the plant bodies.
This leaves less nitrogen for the growth of additional plants, so the increase
in NPP over time (due to a rise in temperature or CO2 levels) will be limited
by nitrogen availability. The same is probably true of other mineral nutrients.
The consequences of warming-induced shifts in the distribution of nutrients
will not be seen rapidly, but perhaps only over many years. These events
may effect changes in species distribution and other ecosystem processes in
complex ways. We know little about the reactions of tropical forests, but
they may differ from those of temperate forests.