- Managing Irrigation and Nitrogen Fertilization to Maximize Productivity and Protect Environmental Quality
- Nitrogen Fertilizer Movement in Turf
- Urban Water Conservation BMP 5 Requires Irrigation Management Using ETo-based Water Use Budgets
- Non-Potable Water Recycling Criteria Priority Issues Ranked at Workshop
- Microbial Considerations in Wastewater Reclamation and Reuse
Managing Irrigation and Nitrogen Fertilization to Maximize Productivity and Protect Environmental Quality
The scientific community must provide quantitative results that guide farmers in making agricultural management decisions that optimize the dual goals of high crop yield and low environmental degradation.
Application of nitrogen (N) fertilizer usually increases yield but can also contribute to polluting surface and ground waters via runoff and leaching because nitrate (NO3-) is not bound to soil colloids. The goal of N fertilizer management is to keep N in the root zone -- to make it available to the target crop and to reduce off-site pollution -- which can be accomplished with efficient irrigation management.
Irrigation management options include the type of irrigation system, irrigation timing, and the amount of water applied. Nitrogen management options include the timing of N fertilization, the methods of fertilizer application, and the chemical form of N used. Field experiments that include all combinations of N and irrigation management variables are prohibitively expensive. An alternative is to use simulation models that integrate irrigation, N management, soil, and climate information to predict expected crop growth and NO3- leaching beyond the root zone.
In a recent UCR study, the CERES-Maize Model (Version 2.10, Jones & Kiniry, 1986) was used to simulate a comparison of soil-applied N and fertigation (applying fertilizer to crops by mixing it with irrigation water) on corn yield and NO3- leaching potential beyond the root zone.
Soil applications of fertilizer during the growing season are constrained by the difficulty of moving equipment across the field as the crop grows. Soil-applied N fertilizer treatments occur less frequently and at higher rates of application than fertigation. In principle, fertigation allows the timing and quantity of fertilizer application to be programmed to match N uptake dynamics by plants, which means that fertigation might be expected to lead to less groundwater degradation than fertilizer applied to the soil surface.
Mitigating these theoretical benefits, however, is the fact that fertilizer distribution across the field under fertigation depends on the distribution uniformity (DU) of the irrigation water application. According to Department of Water Resources Bulletin 160-98, irrigation experts estimate that current hardware design and manufacturing technology limit the DU of most systems to 80 percent, whereas modern fertilizer application machinery allows soil-applied fertilizer to be distributed uniformly across the field.
The three most significant findings of this simulation experiment were the following:
- The difference in yield between fertigation and soil application of N for a given DU in the irrigation system was less than 2%. The difference in the amount of N leached was less than 10 kg ha-1.
- At any of the three N application rates, the simulated yield increased as the DU increased from 75% to 100%. Although 100% uniformity is impractical, the general trend of yield increasing as DU increases is consistent with many other studies.
- Fertigation did not significantly mitigate the effects of nonuniform irrigation on crop yield or N leaching.
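The DU figures above describe how evenly irrigation water lands across a field. The article does not give the formula, but a common field measure is low-quarter distribution uniformity: the mean of the lowest quarter of measured application depths divided by the overall mean. A minimal sketch, using hypothetical catch-can depths:

```python
def du_low_quarter(depths):
    """Low-quarter distribution uniformity (%): mean of the lowest
    quarter of measured application depths over the overall mean."""
    if not depths:
        raise ValueError("need at least one depth measurement")
    ordered = sorted(depths)
    n_lq = max(1, len(ordered) // 4)          # size of the lowest quarter
    low_quarter_mean = sum(ordered[:n_lq]) / n_lq
    return 100.0 * low_quarter_mean / (sum(ordered) / len(ordered))

# Hypothetical catch-can depths (inches) from an irrigation audit
print(round(du_low_quarter([0.9, 1.0, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9]), 1))  # 86.1
```

A DU near the 80 percent hardware limit cited above means the driest quarter of the field receives roughly four-fifths of the average application, so uniform watering requires deliberate overapplication elsewhere.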
This study, conducted by Xueping Pang and John Letey of UCR's Department of Environmental Sciences, has been accepted for publication in Agronomy Journal.
Nitrogen Fertilizer Movement in Turf
UCR scientists who monitored the movement of fertilizer nitrogen (N) below the root system of mature cool-season turf when N was applied at high rates and frequent intervals concluded that mature turf swards act like "sponges" that soak up fertilizer N, leaving little for leaching.
Nitrogen fertilizer sources tested for their nitrate (NO3-) leaching potential were granular urea (46-0-0), sulfur-coated urea (SCU, 37-0-0, 30% dissolution rate), and blood meal (13-0-0). These sources, classified as soluble, slow-release, and natural organic, respectively, represented the possible range of NO3- leaching potential.
"Even at very high nitrogen fertilization rates, there is little probability of significant nitrate leaching from any of the tested sources on a mature turfgrass stand. Only urea fertilization gave levels of nitrate leachate that were above the tap water content, but still below federal guidelines. The slow-release sources, particularly the blood meal, presented the lowest potential for nitrate leaching," said Vic Gibeault, Extension Environmental Horticulturist.
The concentration of NO3- in collected leachate, given in parts per million (ppm or mg/l), ranged from 9.8 ppm (urea) to 3 ppm (SCU) to less than 1 ppm (blood meal). The NO3- content of the tap water used for irrigation ranged from 6.1 to 6.5 ppm.
Fertilizing mature turf swards does not pose a threat to the environment from N contamination when the following irrigation best management practices (BMPs) are used to minimize losses via runoff or volatilization:
- "Water-in" fertilizer immediately after application. NH4+ (ammonium ions) can be volatilized to gaseous ammonia (NH3) and lost to the atmosphere, unless dissolved quickly in water. Gaseous N loss can be minimized to about 1% if fertilizer is watered-in.
- Avoid overirrigation after fertilization. In saturated soil, microbes reduce NO3- to nitrous oxide (N2O) gas and elemental N (N2) gas, both subject to volatilization losses. To protect surface waters from N contamination, avoid runoff after fertilization.
- Use low N rates or slow-release N sources on sands or very leachable soils.
- Apply N fertilizer when soil NO3- levels are expected to be low and turf roots can take up the nutrient.
Among Gibeault's collaborators were Marylynn Yates, Environmental Microbiologist and Groundwater Quality Specialist, and Jewell Meyer, retired Extension Soil and Water Specialist.
Urban Water Conservation BMP 5 Requires Irrigation Management Using ETo-based Water Use Budgets
Recent revisions to best management practice 5 (BMP 5), Large Landscape Conservation Programs and Incentives, set the maximum allowable irrigation water applied annually at 1.0 ETo per square foot of landscape area for accounts with dedicated irrigation meters at commercial, industrial, and institutional (CII) sites.
ETo, known as "reference" evapotranspiration, approximates the water use of a 4- to 6-inch tall, healthy cool-season grass. (Water use by plants consists primarily of two components, soil evaporation [E] and plant transpiration [T], hence the term 'evapotranspiration'.)
Water agencies that signed the Memorandum of Understanding (MOU) Regarding Urban Water Conservation in California, governed by the California Urban Water Conservation Council (CUWCC), before Dec. 31, 1997 were required to implement BMP 5 no later than July 1, 1999.
BMP 5 allows local water agencies to use an adjustment factor that reduces the water budget to an amount less than 100% ETo. Agencies can provide water budgets to CII accounts for informational purposes only or can link them to water pricing strategies. BMP 5 does not require differentiation of plant materials in the water budget process.
At each billing cycle, water agencies must notify CII accounts with dedicated irrigation meters of their actual consumption and its relationship to the water use budget. According to BMP 5, CII landscapes also include multifamily residential sites with dedicated irrigation meters (homeowners' associations).
For CII accounts with mixed-use meters or no meters, BMP 5 requires water agencies to conduct water use surveys (audits) and to offer conservation measures specified in the MOU.
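The budgeting arithmetic behind an ETo-based allocation is straightforward. A minimal sketch, assuming the standard conversion of roughly 0.62 gallons per square foot per inch of water; the site figures are hypothetical, since BMP 5 itself specifies only the 1.0 ETo cap and an optional agency adjustment factor:

```python
# Annual water budget for a dedicated-meter CII landscape account,
# following the BMP 5 idea of capping applied water at 1.0 x ETo.

GALLONS_PER_SQFT_INCH = 0.62  # ~7.48 gal per cubic foot / 12 inches per foot

def annual_water_budget(eto_inches, area_sqft, adjustment=1.0):
    """Annual landscape water budget in gallons.

    eto_inches -- annual reference evapotranspiration at the site (inches)
    area_sqft  -- irrigated landscape area (square feet)
    adjustment -- agency adjustment factor (1.0 = 100% ETo, the BMP 5 cap)
    """
    return adjustment * eto_inches * area_sqft * GALLONS_PER_SQFT_INCH

# Hypothetical site: 50 inches/year ETo, 10,000 sq ft, budget set at 80% ETo
budget = annual_water_budget(50, 10_000, adjustment=0.8)
print(f"{budget:,.0f} gallons per year")  # 248,000 gallons per year
```

At each billing cycle, an agency would compare metered consumption against the prorated share of this budget.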
The first reporting period to CUWCC, which is composed of MOU signatories, begins July 1, 2001. CUWCC is responsible for monitoring implementation of BMPs and reporting progress to the State Water Resources Control Board (SWRCB). Nearly 250 water agencies, public interest groups, and environmentalists have signed the MOU since 1991.
The BMP 5 Handbook: A Guide to Implementing Large Landscape Conservation Programs as Specified in Best Management Practice 5 will be available from the CUWCC in September 1999.
Water Savings Projections
Landscaping programs in the Metropolitan Water District of Southern California (MWD) service area are projected to account for 10 to 11% of total water conservation savings, says the MWD's Integrated Water Resources Plan (Report No. 1107, March 1996).
UCR research on turf coefficients by Vic Gibeault, Extension Environmental Horticulture Specialist, and his colleagues can be used with historical or real-time ETo data to facilitate the water budgeting required by BMP 5 and to conserve irrigation water.
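Turf coefficients (Kc) scale ETo to a specific turf's expected water use via the standard relationship ETc = Kc x ETo. A minimal sketch; the Kc and ETo values below are illustrative, not figures from the UCR study:

```python
def turf_water_use(eto_inches, kc):
    """Estimated turf evapotranspiration: ETc = Kc x ETo."""
    return kc * eto_inches

# Illustrative July values: 7.0 inches of ETo, cool-season turf Kc of 0.8
print(round(turf_water_use(7.0, 0.8), 1))  # 5.6 inches for the month
```

Summing the monthly ETc values over a year gives the turf-specific counterpart to the 1.0 ETo budget discussed above.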
Real-time ETo can be obtained from CIMIS (California Irrigation Management Information System) by calling 1-800-92CIMIS or by computer using the website address http://wwwdpla.water.ca.gov/cgi-bin/cimis/cimis/hq/main.pl
Historical ETo, crop coefficients, and other statistics can be obtained from Laosheng Wu's UCR Cooperative Extension website http://esce.ucr.edu/soilwater
The efficacy of water banking for turfgrass (allocating landscape irrigation water on an annualized basis), which accounts for reduced physiological demand for water in the winter and excess demand in the summer, is currently being evaluated by Robert Green, UCR Turfgrass Research Agronomist.
BMP 5 has no express provision for seasonal carryover (water banking); however, the Otay Water District in San Diego has implemented annualized water budget allocations based on 100% ETo. In Otay, unused water (up to 12 inches) is banked to avoid overuse penalties during hot spells, establishment of new plantings, fertilization, or irrigation system failures that cause unanticipated usage. Otay's water-efficient landscape irrigation ordinance has been in effect since June 1992.
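Annualized accounting of this kind can be pictured as a simple monthly ledger. This is an assumed mechanism for illustration only, not Otay's actual billing algorithm; depths are in inches over the landscaped area, with the bank capped at 12 inches as in the ordinance:

```python
def run_water_bank(monthly_budget, monthly_use, bank_cap=12.0):
    """Track banked water (inches) month by month.

    Unused budget accrues in the bank up to bank_cap; overuse draws
    the bank down first, and only the shortfall counts as overuse.
    Returns (final bank balance, total unpenalized-overuse shortfall).
    """
    bank = 0.0
    overuse = 0.0
    for budget, use in zip(monthly_budget, monthly_use):
        balance = budget - use
        if balance >= 0:
            bank = min(bank_cap, bank + balance)  # save unused water
        else:
            draw = min(bank, -balance)            # cover overuse from bank
            bank -= draw
            overuse += -balance - draw
    return bank, overuse

# Hypothetical year: winter under-use banked against summer overuse
budget = [2, 2, 3, 4, 5, 6, 7, 7, 6, 4, 3, 2]
use    = [1, 1, 2, 4, 6, 7, 8, 8, 6, 4, 2, 1]
print(run_water_bank(budget, use))  # (2.0, 1.0)
```

The winter surplus absorbs most of the summer overdraft, which is the reduced-winter-demand effect the Green evaluation described above is examining.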
The statewide Model Water Efficient Landscape Ordinance was added to Title 23 of the California Code of Regulations in response to the 1990 Water Conservation in Landscaping Act (AB 325).
(Appreciation is expressed to Dennis Pittenger, UC Extension Area Environmental Horticulture Specialist, Central Coast and South Region, for reviewing this article.)
Non-Potable Water Recycling Criteria Priority Issues Ranked at Workshop
What priority issues need to be addressed to establish non-potable water recycling criteria to protect public health using cost-effective technology? A recent workshop sponsored by the National Water Research Institute (NWRI) in cooperation with the Irvine Ranch Water District and the Orange County Water District asked participants to identify and rank issues in response to this question.
The 37 participants represented wastewater utilities, consulting engineering firms, regulatory agencies, and universities. The two-day workshop used the Nominal Group Technique to deliberate significant water issues and reach consensus.
The 97 issues identified were consolidated into 25 major issues. The 10 ranked as highest priorities are listed below:
- Microbial risk assessment methodologies as a tool to help establish water reuse criteria
- Identify reuse criteria that are both protective of public health and enable maximum flexibility and efficient use of treatment technologies
- Understand the pathogen inactivation relationship and performance parameters for various disinfection and treatment processes to develop cost-effective public health protection
- Develop a program to quantify, measure, compare, and communicate relative levels of safety of non-potable reuse to the public and policymakers
- Water quality standards for chemical constituents
- Establish a rational basis for demonstrating equivalent treatment with alternative processes for pathogen removal/inactivation
- Ensure recycled water is microbiologically safe for its intended uses
- Maintain water quality in the reclaimed water storage/distribution system
- Standardize protocols for field testing of equipment and water recycling practices
- Develop new or improved real-time and/or rapid-monitoring strategies to verify treatment and disinfection reliability
Founded in 1991 by a group of Southern California water agencies in partnership with the Joan Irvine Smith - Athalie R. Clarke Foundation, the NWRI creates new water sources via research and technology. The workshop report is NWRI-99-02.
Microbial Considerations in Wastewater Reclamation and Reuse
Part III. Assessing Microbial Risks
(Editor's Note: Part I. Types and Occurrence of Microbial Pathogens in Wastewater and Part II. Survival and Fate of Microbial Pathogens on Food and Non-Food Crops Irrigated with Reclaimed Wastewater were published in the Fall 1998 and Winter 1999 issues of Soil Water and Irrigation Management. The series of three articles is adapted from and excerpted in part from a comprehensive chapter authored by Dr. Marylynn V. Yates, Professor of Environmental Microbiology and Extension Groundwater Quality Specialist in the Department of Environmental Sciences at the University of California,
Riverside and Dr. Charles P. Gerba, University of Arizona, Tucson. Their 51-page chapter, "Microbial Considerations in Wastewater Reclamation and Reuse," is published in Wastewater Reclamation and Reuse (1998), edited by Takashi Asano, Ph.D., P.E. (Volume 10 of the Water Quality Management Library), Technomic Publishing Co., Inc., Lancaster, PA.)
The potential for pathogens in reclaimed water to contaminate the underlying groundwater depends on the site's physical characteristics, the hydraulic conditions, the environmental conditions, and the characteristics of the specific pathogens present in the reclaimed water. Recent advances in molecular techniques have enhanced the ability to monitor the transport of a variety of pathogens in the subsurface. Ongoing studies in Southern California and Arizona are using polymerase chain reaction (PCR) and specific nucleic acid probes to determine whether viruses (e.g., hepatitis A, rotaviruses, enteroviruses) are present in groundwaters impacted by artificial recharge with treated wastewaters.
From a public health perspective, the potential risk of pathogenic organisms and carcinogenic chemicals to contaminate groundwater artificially recharged with treated wastewaters must be assessed. The chemicals of greatest concern are the disinfectants (D) used to reduce pathogen concentrations in wastewater and the disinfection by-products (DBPs) formed by the reaction of disinfectants with organic compounds in the water. To develop an optimum strategy for the protection of public health, the relative risks associated with various concentrations of pathogenic microorganisms and D-DBPs in reclaimed water must be quantified.
There are four steps in a formal risk assessment: hazard identification, dose-response determination, exposure assessment, and risk characterization.
The microorganisms of concern when using wastewater to artificially recharge groundwater can be identified using data from waterborne disease outbreaks. At present, scientists cannot completely identify the hazard because the causative agents in approximately one-half of the waterborne disease outbreaks in the United States are never identified. In groundwater systems, etiologic (disease-causing) agents were identified in only 38% of outbreaks, with Shigella species and hepatitis A virus the most commonly identified pathogens; in the remaining 62%, no etiologic agent could be identified, and the illness was simply listed as gastroenteritis of unknown etiology. However, retrospective serological studies of outbreaks of acute nonbacterial gastroenteritis from 1976 through 1980 indicated that 42% of the outbreaks of unknown etiology were caused by the Norwalk virus.
The difficulty in isolating many enteric viruses from clinical and environmental samples probably accounts for the limited number of viruses identified as causes of waterborne disease. To date, no standardized, routine procedures are available for isolating and identifying hepatitis A and E viruses in environmental (i.e., soil and water) samples. Currently, there is no method for culturing the Norwalk virus in the laboratory. As methods for the detection of enteric viruses and parasites have improved, the percentage of waterborne disease identified as having a viral or parasitic etiology has increased.
In addition to these hazards, the level of endemic microbial disease associated with drinking water must also be identified. An epidemiological study that examined the contribution of drinking water to endemic gastrointestinal illness found that approximately one-third of such illness could be the result of consuming treated drinking water that met all water quality standards and contained no pathogens detectable by current technologies. Recently, it has been recognized that exposure to microbial pathogens in drinking water may also lead to chronic health problems, such as diabetes.
The next step in the process is to determine the effects (response) from exposure to a given dose of the microbes. When determining the dose-response relationship, three different responses to microbial exposure are possible: infection that remains subclinical, infection that results in clinical illness, and infection that leads to illness and subsequent death. Dose-response data on the ability of a microorganism to cause infection are generally obtained by exposing relatively small groups of healthy human volunteers to different doses of the microorganisms of interest; thus, the results represent average or possibly best-case situations. According to the infective dose-response data of several enteric microorganisms listed in Table 1, out of a population of 100 persons exposed to one rotavirus organism, 31 healthy persons would become infected; 1 in 1000 would be infected from exposure to one Shigella microbe; and about 4 in 100,000 would be infected from exposure to one Salmonella typhi microbe.
Certain populations may be more at risk from exposure to the same dose than the healthy volunteers. Very young and very old individuals have a higher risk of severe illness and even death from exposure to pathogens than do other population groups. Individuals with suppressed immune systems may also be more susceptible to infection, illness, and death than healthy individuals.
Another issue that has not been addressed is the effect of exposure to mixtures of contaminants. It is not known whether the risks are additive, synergistic, etc.
Assessing exposure is the most difficult and uncertain aspect of the risk assessment process. Exposure to pathogens in reclaimed water may occur by direct ingestion of the reclaimed water if recharge is by infiltration, or by ingestion of potable groundwater that has been contaminated by the recharge process. Viruses will be of greater concern when exposure is to contaminated groundwater. The number of each of the pathogens ingested must be known to determine the exposure, which means the volume of water consumed and the pathogen concentration must be known. Disease can also spread to contacts of the individuals who were exposed directly to the contaminated water (secondary exposure). Secondary attack rates of enteric viruses range from 30% (Norwalk virus) to 90% (poliovirus).
Scientists calculate the risk of infection, illness, or death from exposure to various pathogens by using mathematical models and the information generated from the first three steps of the risk assessment process. Input data include the identity and concentration of the pathogen(s) of interest, assumptions about the volume of contaminated water consumed, the source of and treatment received by the reclaimed wastewater, and other factors specific to the particular risk assessment. Risks may be calculated from a single exposure or from multiple exposures over an extended period of time. For calculating microbial risks from drinking water, for example, the Environmental Protection Agency (EPA) calculates annual risk of infection. For this situation, the EPA has established an acceptable risk level of one infection per 10,000 persons per year.
Table 1. Probability of infection from exposure to one organism

| Microorganism | Infections per 10,000 Persons Exposed |
| --- | --- |
| Rotavirus | 3,100 |
| Shigella | 10 |
| Salmonella typhi | 0.4 |
| Vibrio cholerae (classical) | 0.07 |
| Vibrio cholerae (El Tor) | 1.5 |

Source: Rose, J. B. and C. P. Gerba. 1991. Use of risk assessment for development of microbial standards. Wat. Sci. Technol. 23:29-34, as cited in Yates and Gerba.
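The risk-characterization arithmetic can be sketched from the numbers above. The per-organism probability for rotavirus comes from the text; the exposure scenario (organisms per liter, liters consumed per day) is hypothetical, and treating each ingested organism as an independent chance of infection is a simplification of formal dose-response models:

```python
def p_infection(per_organism_prob, dose):
    """Probability of infection from ingesting `dose` organisms, treating
    each organism as an independent chance of infection (a simplification
    of formal dose-response models)."""
    return 1.0 - (1.0 - per_organism_prob) ** dose

def annual_risk(daily_risk, days=365):
    """Annualized risk from repeated independent daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** days

# Rotavirus: ~0.31 infection probability per organism (from the text).
# Hypothetical exposure: 0.001 organisms per liter, 2 liters consumed daily.
daily = p_infection(0.31, 0.001 * 2)
print(f"daily risk:  {daily:.1e}")
print(f"annual risk: {annual_risk(daily):.1e}  (EPA benchmark: 1.0e-04)")
```

Even this toy calculation shows why highly infective viruses dominate the risk picture: very low concentrations, compounded over a year of exposures, can exceed the EPA's one-in-10,000 annual benchmark.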
Soil Water and Irrigation Management is published quarterly by Dr. Laosheng Wu, Cooperative Extension Irrigation/Water Management Specialist in the Department of Environmental Sciences at the University of California, Riverside. The intent of the publication is to disseminate summaries of research results and topics of interest to UC farm advisors, CE specialists, and UC faculty working in areas related to water management and environmental sciences. The publication is edited by Dr. Wu and Deborah Silva and designed by Jack Van Hise, UCR Printing and Reprographics. Additional copies are available upon request. Please address comments and correspondence to Dr. Wu. Contact Dr. Wu via e-mail (firstname.lastname@example.org ), by telephone (951-827-4664), by fax (951-827-2954), or by mail addressed to him in the Department of Environmental Sciences, University of California, Riverside, Riverside, CA 92521. Soil Water and Irrigation Management is issued in furtherance of Cooperative Extension work, Acts of May 8, 1914 and June 30, 1914, in cooperation with the United States Department of Agriculture, W. R. Gomes, Director, University of California Cooperative Extension. The University of California, in accordance with Federal and State law and University policy, does not discriminate on the basis of race, color, national origin, religion, sex, disability, age, medical condition (cancer-related), ancestry, marital status, citizenship, sexual orientation, or status as a Vietnam-era veteran or special disabled veteran. The University also prohibits sexual harassment. Inquiries regarding the University’s nondiscrimination policies may be directed to the Affirmative Action Director, University of California, Agriculture and Natural Resources, 1111 Franklin Street, Oakland, CA 94607-5200, (510) 987-0096.