Wednesday, March 10, 2021
Treatment for Contaminants of Emerging Concern (CECs)
Drinking water treatment plants (DWTPs) are increasingly being challenged by changes in the quality of their source waters and by their aging treatment and distribution system infrastructure. Factors such as shrinking water and financial resources, climate change, agricultural runoff, harmful algal blooms (HABs), and industrial land use increase the probability that CECs (chemicals that have not previously been detected in water, or that are being detected at significantly different levels than expected), such as pesticides, pharmaceuticals, personal care products, endocrine disrupting compounds, and algal toxins will remain after treatment, ending up in consumers’ drinking water. This is likely to disproportionately affect small drinking water systems due to limited resources and treatment options, among other factors. Identifying and quantifying the source water and treatment challenges for water systems is an important step towards mitigating present and future risks.
The following studies will help improve our understanding of the spread of contaminants through drinking water treatment, and identify best approaches for removal.
Removal of cyanobacteria and cyanotoxins through drinking water treatment
EPA is examining the spread and removal of cyanobacterial cells and their associated toxins at DWTPs using samples collected throughout treatment trains at facilities along the shoreline of Lake Erie. This study will improve our understanding of the dynamics of algal toxin release or removal through drinking water treatment processes. Additional details can be found here: Harmful Algal Blooms Cyanobacteria.
- Fact Sheet: Evaluation of Current Water Treatment and Distribution System Optimization to Provide Safe Drinking Water from Various Source Water Types and Conditions
- Webinar recording: Current Water Treatment and Distribution System Optimization for Cyanotoxins
Characterizing the fate of contaminants released from landfills in Alaska
In collaboration with five tribal communities, EPA has completed a study characterizing the fate of contaminants released from landfills in rural Alaska and their potential impact on local drinking water sources. The tribal communities of Allakaket, Eek, Ekwok, White Mountain, and Fort Yukon, along with operators of other dump sites throughout the State of Alaska, are using the study findings to improve the management of their landfills.
Removal of Inorganic and Organic Contaminants
EPA is working to provide information and treatment approaches to small systems to help them manage inorganic contaminants in their water supplies. In addition, the information will assist with revisions to drinking water regulations and can be used by the states for communicating novel and relevant treatment technologies to their systems.
Ammonia Removal Research
Many regions in the U.S. have excessive levels of ammonia in their drinking water source as a result of naturally occurring processes or contamination from agricultural runoff. Ammonia in water does not pose a direct health concern; however, it may pose a concern when nitrification occurs in the drinking water distribution system. Nitrification is the conversion of ammonia to nitrite and nitrate by bacteria, and it can lead to water quality issues such as pipe corrosion, loss of disinfectant residual, taste and odor complaints, elevated nitrate and nitrite levels, and poor water treatment performance. EPA's research in this area is providing communities with the technologies they need to address these issues.
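To make the compliance stakes concrete, here is a minimal nitrogen-balance sketch (Python, illustrative only): because nitrogen is conserved, ammonia-N that is fully nitrified appears as an equal concentration of nitrate-N, which counts against the nitrate limit.

```python
# Minimal nitrogen-balance sketch for nitrification (NH3-N -> NO2-N -> NO3-N).
# Assumes complete conversion unless a smaller fraction is given; the federal
# nitrate MCL is 10 mg/L measured as nitrogen.
def potential_nitrate_n(ammonia_n_mg_per_l: float, fraction_nitrified: float = 1.0) -> float:
    """Nitrate-N (mg/L) that could form from the given ammonia-N concentration."""
    return ammonia_n_mg_per_l * fraction_nitrified

# Example: 3 mg/L of ammonia-N entering the distribution system could yield
# up to 3 mg/L of nitrate-N if fully nitrified.
print(potential_nitrate_n(3.0))
```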
Highlight: A small community in Iowa relied on individual and neighborhood shallow wells, which became contaminated after a flood in 2008. EPA and the State of Iowa Department of Natural Resources conducted a pilot study using an EPA developed and patented biological water treatment technology for ammonia oxidation. The pilot system, designed, built, and installed by EPA staff, effectively removed ammonia and iron from the community's source water, while keeping nitrite and nitrate levels below their respective maximum contaminant level in the treated water. As part of a federal Housing and Urban Development (HUD) grant, a full-scale water treatment plant based on the pilot system was completed in January 2014. The community now has a functioning public water system that meets all regulations.
Publications:
- Engineering Design and Operation Report – Biological Treatment Process for the Removal of Ammonia from a Small Drinking Water System in Iowa: Pilot to Full-Scale
- Innovative Biological Water Treatment for the Removal of Elevated Ammonia
Arsenic Treatment Technology Demonstrations
EPA is at the forefront of investigating arsenic removal technologies and their cost, including capital and operating costs. From 2002 to 2012, EPA funded and studied 50 small, full-scale arsenic removal systems in 26 different states, impacting over 60,000 consumers. The technologies studied included the three most commonly utilized by small systems: adsorptive media, iron removal, and coagulation/filtration. The study resulted in the most comprehensive set of performance and cost data ever collected on drinking water treatment for a specific contaminant. The extensive data set is currently being summarized to inform other communities.
- Arsenic Treatment Technology Demonstrations website
- Webinar recording: Research and Implementation of Arsenic Removal Technologies at Small Community Water Systems
Publications:
- The Costs of Small Drinking Water Systems Removing Arsenic from Groundwater
- Regeneration of Iron-based Adsorptive Media Used for Removing Arsenic from Groundwater
- Arsenic Species in Drinking Water Wells in the USA with High Arsenic Concentrations
Evaluation of cost-effective aeration technologies to address disinfection byproducts (DBPs) compliance
EPA conducted evaluations of cost-effective aeration technology solutions to address DBP compliance, specifically total trihalomethane (TTHM), at a water treatment plant clearwell (storage tank). DBPs form during the drinking water treatment process.
Fluoride Treatment Technologies Research
EPA evaluated full-scale water treatment plants with existing fluoride removal treatment systems and followed with demonstrations of other novel approaches. Systems' operators collected water samples to document performance and to determine the ultimate fate of the contaminants. The results of the studies were used to fill the need for a comprehensive manual on available fluoride removal technologies. The manual supports the fluoride maximum contaminant level (MCL) and provides an in-depth presentation of the steps required to design and operate a fluoride removal plant using activated alumina, a reliable and cost-effective process for removing excess fluoride from drinking water supplies. This effort will build confidence in novel technologies and approaches so that communities and state primacy agencies will accept them, knowing that they will successfully remove the contaminants of interest without concern about the systems' overall sustainability.
Brominated disinfection byproducts (DBPs) studies
To protect public health, public water system (PWS) operators must meet federal limits for disinfection byproducts (DBPs) formed during the water treatment process. A better understanding of the relationship between bromide in source water and DBP formation will help operators in the Ohio River Watershed improve treatment processes, ensure compliance with federal limits, and provide important information to state, local, and federal regulators responsible for protecting the rivers and streams in the watershed.
EPA is evaluating water samples collected at eight PWSs to investigate relationships between bromide in source water and the formation of brominated DBPs in finished drinking water. Once the results from the collected samples have been finalized, EPA will analyze the data to determine whether correlations exist between bromide in source water and brominated DBP formation. Based on the analysis, EPA will then determine whether models can be used to estimate DBP formation, optimize treatment, and inform source control strategies.
Treatment, control, and assessment strategies for lead and copper release
EPA is evaluating water treatment strategies for the control of lead and copper release from drinking water plumbing materials and components. Specifically, scientists are conducting investigations on the impact of water quality on lead and copper release, pipe scale aging on copper release, and complications of metal contamination arising from accumulated deposits of iron, manganese, and aluminum on lead or copper pipe surfaces. EPA is also evaluating the optimization and interaction of treatment processes and the resulting water quality impacts on the nature of mineral scales and deposits in real water systems.
- Webinar recording: Corrosion Control for Drinking Water Systems
- Fact sheet: How to Identify Lead Free Certification Marks for Drinking Water System and Plumbing Products
Pathogens and Disinfection
Drinking water can be a source of waterborne illnesses due to the contamination of source waters or treated water as it moves through the distribution system to consumer taps. The growth of pathogens associated with microbial communities, known as biofilms, can occur on drinking water pipe surfaces. Current treatments include the use of disinfectants, such as chlorine and monochloramine; UV treatment; and filtration. To control disinfection byproducts, many utilities have switched from chlorine to monochloramine as both their primary and residual disinfectant. Previous studies have shown the differences between chlorine and monochloramine at killing specific types of pathogens and their ability to penetrate into biofilms; however, there is limited data on their effectiveness at controlling the occurrence and growth of pathogens in distribution systems.
Water quality issues in large buildings and emerging treatment technologies for premise plumbing-related pathogens
The Safe Drinking Water Act (SDWA) sets limits on water quality indicators for water in the distribution system. Once this distributed water enters a building or household, the responsibility for maintaining water quality shifts to the owners. The latest data for waterborne diseases indicates that premise plumbing-related outbreaks are increasing across the Nation. This fact, and the legal ramifications of waterborne outbreaks, are leading hospital and hotel owners to address water quality in their buildings. EPA is investigating water quality issues in large buildings and evaluating emerging treatment technologies to control premise plumbing-related microbial pathogens, with the goal of providing information to building owners on how water quality changes as it moves through complex premise plumbing systems. These investigations include research on both pathogen and corrosion control.
Validation of ultraviolet (UV) disinfection of ground and surface water systems
UV disinfection is an effective process for inactivating many microbial pathogens found in source waters with the potential as stand-alone treatment or in combination with other disinfectants. Small- to medium-sized drinking water systems often have limited resources and expertise to evaluate and install innovative technologies. As a result, they typically do not optimize operations and often apply significantly higher UV doses than necessary. Currently, there is no standard UV testing protocol for viruses under the Ground Water Rule, and there is a lack of recommendations for efficient operation in small systems for virus inactivation applications. EPA is evaluating new approaches for validating UV reactors to meet groundwater and surface water pathogen inactivation goals, including those for virus control for low-pressure and medium-pressure UV systems. The research study is expected to contribute to better public health protection with more accurate UV dose monitoring and reduced capital and operation and maintenance costs. This work will also reduce the burden on states and utilities to prove a novel UV treatment system complies with the EPA’s UV guidance manual.
Publications:
- Reduction of Microbial Contaminants in Drinking Water by UV Light Technology: ETS UV MODEL UVL-200-4
- Reduction of Microbial Contaminants in Drinking Water by UV Light Technology: ETS UV MODEL ECP-113-5
Field studies of low-pressure UV lamps at ground and surface water systems
Some consumers in small rural communities in Puerto Rico face health risks because they rely on unfiltered ground and surface waters for drinking water, which leads to periodic outbreaks of waterborne diseases. These remote communities lack the economic and technical capacity to comply with drinking water regulations. In addition, traditional water treatment technologies are expensive to operate and maintain.
In collaboration with partners, EPA evaluated UV disinfection systems in two rural communities to determine the effectiveness of low-pressure UV lamps on inactivating pathogens in groundwater and surface water supplies. For both studies, the operation and maintenance costs associated with water delivery, prefiltration, and disinfection systems were compared. The studies benefited from the inclusion of citizen science, which involved training the communities on the capabilities and operation and maintenance of the UV systems. This research study is expected to provide conceptual diagrams of design alternatives for small systems capable of UV inactivation of chlorine resistant pathogens. Research results and lessons learned will be incorporated into a report on the installation, ease of use, and effectiveness of UV disinfection for both surface and groundwater supplies in rural Non-PRASA communities in Puerto Rico and other tropical environments.
Evaluating the effectiveness of disinfectants on microbial communities in water distribution systems
This research will add to our knowledge of how commonly used disinfectants (chlorine and monochloramine) affect microbial communities, known as biofilms, in drinking water distribution system pipes. Using molecular tools, a whole metagenome-based approach was applied to evaluate the composition and metabolic potential of these communities. Such information is critical to the design of effective management practices and ultimately helps to prevent waterborne disease and safeguard human health.
Publications:
- Metagenomic analyses of drinking water receiving different disinfection treatments
- Establishment and early succession of bacterial communities in monochloramine-treated drinking water biofilms
Filtration alternatives for small communities and households
This study involved case studies on innovative and commercially available drinking water treatment alternatives for small community water systems. Emphasis was placed on media and membrane filtration technologies capable of meeting the requirements of the Long-Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) and the Groundwater Rule. Studies included household water treatment systems for removal of chemicals and pathogens from well water.
Models, Tools, and Databases
Drinking Water Treatability Database (TDB)
The Drinking Water Treatability Database (TDB) presents referenced information on the control of contaminants in drinking water. It allows drinking water utilities, first responders to spills or emergencies, treatment process designers, research organizations, academicians, regulators and others to access referenced information gathered from thousands of literature sources and assembled on one site. It includes more than 25 treatment processes used by drinking water utilities. The literature includes bench-, pilot-, and full-scale studies of surface waters, ground waters and laboratory waters. The literature includes peer-reviewed journals and conferences, other conferences and symposia, research reports, theses, and dissertations.
Cost Models for Drinking Water Treatment Plants
The Safe Drinking Water Act Amendments of 1996, as well as a number of other statutes and executive orders, require that EPA estimate regulatory compliance costs as part of its rulemaking process. A new series of cost models have been designed for the purpose of estimating the national costs of drinking water regulations, although they can also be used at the individual site scale. The models were designed to be transparent and versatile.
- Drinking Water Treatment Technology Unit Cost Models
- Using Work Breakdown Structure Models to Develop Unit Treatment Costs
EPANET
EPANET was developed as a tool for understanding the movement and fate of drinking water constituents within distribution systems, and can be used for many different kinds of applications in distribution systems analysis. Today, engineers and consultants use EPANET to design and size new water infrastructure, retrofit existing aging infrastructure, optimize operations of tanks and pumps, reduce energy usage, investigate water quality problems, and prepare for emergencies. EPANET can also be used to model contamination threats and evaluate resilience to security threats or natural disasters. EPANET's user interface provides a visual network editor that simplifies the process of building piping network models and editing their properties and data.
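As a hedged illustration of how such a model is typically driven programmatically, the sketch below uses the open-source WNTR Python package (which wraps the EPANET engine) to load a network and run a simulation; the file name "Net1.inp" is a placeholder for any EPANET input file.

```python
# Illustrative sketch: run an EPANET hydraulic/water-quality simulation via the
# open-source WNTR package. "Net1.inp" is a placeholder EPANET input file.
import wntr

wn = wntr.network.WaterNetworkModel("Net1.inp")  # build the model from an .inp file
sim = wntr.sim.EpanetSimulator(wn)               # simulator backed by the EPANET engine
results = sim.run_sim()

# Results are pandas DataFrames indexed by time, with one column per node or link.
pressure = results.node["pressure"]
print(pressure.head())
```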
Environmental Technologies Design Option Tool (ETDOT)
The Environmental Technologies Design Option Tool (ETDOT) is a suite of software models that provides engineers with the capability to evaluate and design systems that use granular activated carbon or ion exchange resins for the removal of contaminants, including PFAS, from drinking water and wastewater.
Breakpoint Chlorination Simulator
The Breakpoint Chlorination Simulator for Drinking Water Systems is a web-based application developed to assist water utilities in generating chlorine breakpoint curves. The simulator generates two side-by-side breakpoint curves for comparison under user-defined conditions.
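For orientation only, here is a back-of-the-envelope estimate of the chlorine dose needed to pass breakpoint, using the theoretical Cl2:NH3-N mass ratio of roughly 7.6:1 (practical doses are often higher, around 8:1 to 10:1); the EPA simulator models the full chemistry and should be preferred.

```python
# Rough breakpoint-dose estimate; assumes the theoretical 7.6:1 Cl2-to-ammonia-N
# mass ratio. Real breakpoint curves depend on pH, temperature, and organic demand.
def breakpoint_dose_mg_per_l(ammonia_n_mg_per_l: float, ratio: float = 7.6) -> float:
    """Approximate free-chlorine dose (mg/L as Cl2) needed to reach breakpoint."""
    return ratio * ammonia_n_mg_per_l

print(breakpoint_dose_mg_per_l(0.5))  # ~3.8 mg/L for 0.5 mg/L of ammonia-N
```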
Chloramine Formation and Decay Simulator
The Chloramine Formation and Decay Simulator for Drinking Water Systems is a web-based application developed to simulate inorganic chloramine formation and subsequent stability, including a simple inorganic chloramine demand reaction for organic matter. It provides two side-by-side simulations and associated graphs to allow comparison of the effects of input choices on chloramine formation and decay.
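The EPA simulator implements detailed inorganic chloramine chemistry; purely as a stand-in to give a feel for the behavior, the sketch below applies a simple first-order decay with a made-up rate constant.

```python
import math

# Simplified stand-in for chloramine stability: first-order decay
# C(t) = C0 * exp(-k * t). The rate constant here is hypothetical; real decay
# depends on pH, temperature, the Cl2:N ratio, and organic demand.
def monochloramine_residual(c0_mg_per_l: float, k_per_hr: float, hours: float) -> float:
    return c0_mg_per_l * math.exp(-k_per_hr * hours)

# Example: 2.0 mg/L leaving the plant, k = 0.01 1/hr, 48 hours of travel time.
print(round(monochloramine_residual(2.0, 0.01, 48.0), 2))  # ~1.24 mg/L
```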
Free Chlorine and Cyanuric Acid Simulator
The Chlorine and Cyanuric Acid System Simulator is an application that simulates water chemistry associated with the free chlorine and cyanuric acid system (i.e., chlorinated cyanurates) at user selected conditions. This allows users to estimate the free chlorine concentration when cyanuric acid is present, as in the case when adding chlorine-containing chemicals (commonly referred to as Dichlor or Trichlor) to water.
Drinking Water treatment: a guide to basic water treatment technology
Water treatment and water treatment technologies are an essential line of defense to remove contaminants and bacteria before the delivery of clean, potable water supplies for consumption. Water sources can be subject to contamination and therefore require appropriate treatment to remove disease-causing agents. Public drinking water systems use a variety of methods to provide safe drinking water for their communities. Different water treatment systems may be in operation in different continents, countries, and regions, depending on regional regulations and raw water quality. The following article provides an overview of the basic principles of water treatment and the processes and technologies involved.
There are over 145,000 active public water systems in the United States (including territories). Of these, 97% are considered small systems under the Safe Drinking Water Act, meaning they serve 10,000 or fewer people.
While many of these active small systems consistently provide safe, reliable drinking water to their customers, many face a number of challenges in their ability to achieve and maintain system sustainability. Some of these small system challenges include lack of expertise to choose, operate, and maintain systems; lack of financial resources; aging infrastructure; limited options for residual disposal; and state agencies with limited resources to support the large number of small systems.
Water treatment: mimicking earth’s hydrological cycle
Maintaining water treatment to ensure a clean supply to meet growing global populations has been an ongoing challenge throughout human history.
Thanks to significant technological developments in water treatment, including monitoring and assessment, high-quality drinking water can be supplied and enjoyed around the world. Replicating the earth’s hydrological cycle in which water is continuously recycled, treatment enables the same water to be cleansed through several natural processes.
To ensure they do not present a health risk, nearly all water sources require treatment before they can be consumed. Many treatment systems are designed to remove microbiological contamination and physical constituents, including suspended solids (turbidity). A final disinfection stage is nearly always included at the end of the treatment process to help deactivate any remaining microorganisms. If a persistent disinfectant, such as chlorine, is added, it can also act as a residual to help prevent biological regrowth during water storage or distribution in larger systems.
Water treatment consists of several stages. These can include initial pre-treatment by settling or coarse media, followed by filtration and then chlorination; this is known as the multiple barrier principle. It allows effective water treatment, with each stage treating and preparing the water to a suitable quality for the next downstream process. For example, filtration can prepare water so that it is suitable for UV (ultraviolet) disinfection.
Depending on the quality and type of the water entering a water plant, treatment may vary. For example, groundwater treatment works abstract water from below ground sources such as aquifers and springs. These sources tend to be relatively clean in comparison to surface water, with fewer water treatment steps required.
Surface water treatment works take water from above-ground sources, such as rivers, lakes and reservoirs. This raw water is subject to direct environmental input. As a result, multiple treatment steps are required, and individual processes must be configurable in different combinations to clean and finally disinfect the abstracted water.
Some water supplies may contain disinfection by-products, inorganic chemicals, organic chemicals and radionuclides. As a result, specialised treatment methods may also be needed to control the formation of these substances or to remove them.
Furthermore, under revised regulations, tighter limits could be placed on endocrine disrupting chemicals, and lead limits could be halved.
How does the water treatment process work?
Coagulation, flocculation and sedimentation are processes used to remove colour, turbidity, algae and other microorganisms from surface waters.
Chemical coagulants can be added to the water to form a precipitate, or floc, that entraps these impurities. After sedimentation and/or filtration, the floc is separated from the treated water.
Aluminium sulphate and ferric sulphate are two of the most commonly used coagulants, although others are available. The rate at which coagulants are dosed in solution is determined by the raw water quality near the inlet of the mixing tank or flocculator.
By adding coagulant at a point of high turbulence, it is rapidly and thoroughly dispersed on dosing. The next stage is the sedimentation tank. Here, aggregation of the flocs takes place, and they settle out to form a sludge that must be removed.
One of the advantages of coagulation is that it reduces the time required to settle out suspended solids. Furthermore, it can be very effective in removing fine particles that are otherwise very difficult to remove.
The cost and the requirements for accurate dosing, thorough mixing and frequent monitoring are often cited as the principal disadvantages of using coagulants for the treatment of small supplies. Bench-scale coagulation (jar) tests can be used to determine which coagulant to use for a specific raw water.
Coagulation and flocculation are therefore considered the most effective techniques for removing colour and turbidity. However, they may not be suitable for small water supplies, due to the level of control required and the volumes of sludge generated.
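As a simple illustration of how bench-scale (jar) test results are used, the sketch below picks the coagulant dose with the lowest settled-water turbidity; the dose/turbidity pairs are made up for the example.

```python
# Illustrative jar-test helper: choose the coagulant dose giving the lowest
# settled-water turbidity. The dose/turbidity pairs below are hypothetical.
jar_test_results = {10: 4.2, 20: 1.8, 30: 0.9, 40: 1.1}  # dose (mg/L) -> turbidity (NTU)

best_dose = min(jar_test_results, key=jar_test_results.get)
print(f"Best dose: {best_dose} mg/L ({jar_test_results[best_dose]} NTU)")
```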
Six essential water treatment technologies
A variety of water treatment technologies are needed to work together, in sequence, in order to purify raw water before it can be distributed. Here is a list of basic technologies often used in water treatment works.
- Screens
Screens are used on many surface water intakes to remove particulate material and debris from the raw water. Weeds and debris can be removed using coarse screens, whereas smaller items, including fish, can be removed using band screens and microstrainers. Microstrainers are also used as a pre-treatment ahead of coagulation or subsequent filtration to reduce the solids loading.
- Gravel filters
Turbidity and algae can be removed using gravel filters, which consist of a rectangular channel or tank divided into several sections and filled with graded gravel (size range 4 to 30 mm). The raw water enters through an inlet distribution chamber and flows horizontally through the tank, encountering first the coarse and then the finer gravel. An outlet chamber collects the filtered water, while solids removed from the raw water accumulate on the floor of the filter.
- Slow sand filters
Turbidity, algae and microorganisms can also be removed using slow sand filters. A simple and reliable process, slow sand filtration is often suitable for the treatment of small supplies provided that sufficient land is available. Slow sand filters usually consist of tanks containing sharp sand (size range 0.15-0.30 mm) to a depth of between 0.5 and 1.5 m.
- Activated carbon
Using physical adsorption, contaminants can be removed with activated carbon. Removal is affected by the amount and type of carbon, the nature and concentration of the contaminant, the retention time of water in the unit, and general water quality (temperature, pH, etc.).
One of the most common media is granular activated carbon (GAC), although powdered activated carbon (PAC) and block carbon are also sometimes used. The filter media is contained in replaceable cartridges, and a particulate filter at the outlet of the cartridge removes carbon fines from the treated water. (A simple adsorption-capacity sketch follows this list.)
- Aeration
Aeration is designed to transfer oxygen into water and to remove gases and volatile compounds by air stripping. Packed tower aerators are a common choice as a result of their compact design and high energy efficiency. Air stripping can be achieved with various techniques, including counter-current cascade aeration in packed towers, diffused aeration in basins and spray aeration.
- Membrane processes
Reverse osmosis (RO), ultrafiltration (UF), microfiltration (MF) and nanofiltration (NF) are the most commonly used membranes for water treatment processes. Previously used mainly to produce water for industrial or pharmaceutical applications, membranes are increasingly being applied to the treatment of drinking water. Membrane processes can provide adequate removal of pathogenic bacteria, Cryptosporidium, Giardia, and potentially, human viruses and bacteriophages. In a notable case study, companies from the Netherlands and Denmark are working on integrating enzymes into membrane technology for the removal of pesticides and pharmaceutical residues from drinking water.
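Returning to the activated carbon item above: GAC capacity for a given contaminant is commonly characterised with the Freundlich isotherm, q = K_F · C^(1/n). The sketch below is illustrative only; the parameter values are hypothetical and in practice come from isotherm testing or the literature.

```python
# Illustrative Freundlich isotherm for activated carbon adsorption:
# q = K_F * C**(1/n), where q is adsorbed mass per mass of carbon and C is the
# equilibrium concentration. The K_F and 1/n values below are hypothetical.
def freundlich_capacity(c_mg_per_l: float, k_f: float, inv_n: float) -> float:
    return k_f * c_mg_per_l ** inv_n

# Example with assumed parameters; units of q depend on the units of K_F.
print(freundlich_capacity(0.05, k_f=30.0, inv_n=0.6))
```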
UV water treatment: shining a light on disinfection
Invisible to the human eye, ultraviolet (UV) light can be used to inactivate microorganisms in water treatment processes. The wavelengths of UV light range between 200 and 300 nanometers (billionths of a meter). Special low-pressure mercury vapor lamps produce ultraviolet radiation at 254 nm, the optimal wavelength for disinfection and ozone destruction. These lamps are categorised as germicidal, meaning they are capable of inactivating microorganisms such as bacteria, viruses and protozoa. The UV lamps never have direct contact with the water: they are either mounted external to the water, which flows through UV-transparent Teflon tubes, or housed in a quartz glass sleeve inside the water chamber.
How does it work? The wavelengths of UV light render bacteria, viruses and protozoa incapable of reproducing and infecting.
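A useful quantity here is the UV dose (fluence), the product of average UV intensity and exposure time. The sketch below computes it for illustration; actual reactor validation relies on biodosimetry under EPA's UV guidance rather than this simple product.

```python
# Illustrative UV dose (fluence) calculation: dose = average intensity x time,
# where 1 mW*s/cm^2 equals 1 mJ/cm^2. Values below are examples only.
def uv_dose_mj_per_cm2(intensity_mw_per_cm2: float, exposure_s: float) -> float:
    return intensity_mw_per_cm2 * exposure_s

# Example: 10 mW/cm^2 for 4 seconds gives 40 mJ/cm^2, a commonly cited
# benchmark dose for point-of-use UV disinfection (assumption).
print(uv_dose_mj_per_cm2(10.0, 4.0))
```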
UV disinfection can be used as the primary disinfection technology for potable drinking water. Furthermore, the process can also be used as a secondary form of disinfection, for example against microorganisms such as Cryptosporidium and Giardia, which can be chlorine-resistant.
In addition, UV light (either alone or in conjunction with hydrogen peroxide) can destroy chemical contaminants such as pesticides, industrial solvents, and pharmaceuticals through a process called UV-oxidation.
Under ideal conditions, a UV unit can provide greater than 99% reduction of all bacteria. However, even with this performance, ultraviolet disinfection has two potential limitations: it provides only "point" disinfection, and it does not remove the inactivated cells.
"Point" disinfection means that UV units kill bacteria only at one point in a water system and do not provide any residual germicidal effect downstream. If just one bacterium passes through unharmed (100% destruction of bacteria cannot be guaranteed), there is nothing to prevent it from attaching to downstream piping surfaces and proliferating.
The second limitation is that bacterial cells are not removed in a UV unit but are converted into pyrogens. The killed microorganisms and any other contaminants in the water become a food source for any bacteria that do survive downstream of the UV unit.
One notable development in UV systems is the scaling up of light-emitting diode technology, known as UV-LED, with 2018 witnessing a tipping point in power density and purchase price.
Ozone water treatment: harnessing the power of lightning
As in a lightning storm, ozone is created when oxygen is exposed to the discharge of a powerful electric current through air. While widely used in Europe for many years to treat municipal drinking water, ozone has not had similar acceptance in the US.
Ozone can be used throughout water treatment, for example during pre-oxidation, intermediate oxidation or final disinfection, as it has excellent disinfection and oxidation qualities. Usually, it is recommended to use ozone for pre-oxidation, before a sand filter or a granular activated carbon (GAC) filter. Following ozonation, these filters can remove the remaining organic matter (important for final disinfection).
Ozonation is carried out by an electric discharge field, as in corona discharge (CD)-type ozone generators, or by ultraviolet radiation, as in UV-type ozone generators. Ozone can also be generated through electrolytic and chemical reactions, in addition to these commercial methods.
In general, an ozonation system passes dry, clean air through a high-voltage electric discharge, i.e., corona discharge, which creates an ozone concentration of approximately 1%, or 10,000 mg/L. In treating small quantities of waste, UV ozonation is the most common approach, while large-scale systems use either corona discharge or other bulk ozone-producing methods.
The raw water is then passed through a venturi throat, which creates a vacuum and pulls the ozone gas into the water; alternatively, the ozone-laden air is bubbled up through the water being treated. Since ozone reacts with metals to create insoluble metal oxides, post-filtration is required.
Ozone is highly reactive and, as a result, has a very short half-life once dissolved into water. The natural reaction is for ozone to return to its oxygen form, with a reaction time typically taking 10-20 minutes at 20 degrees Celsius.
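Using the half-life figures above, the minimal sketch below estimates how much dissolved ozone remains after a given contact time; actual decay depends strongly on water quality, so the numbers are illustrative.

```python
# Residual-ozone estimate from a half-life: C(t) = C0 * 0.5**(t / t_half).
# The 15-minute half-life is taken from the 10-20 minute range quoted above.
def ozone_residual(c0_mg_per_l: float, half_life_min: float, minutes: float) -> float:
    return c0_mg_per_l * 0.5 ** (minutes / half_life_min)

print(round(ozone_residual(1.0, 15.0, 30.0), 2))  # ~0.25 mg/L left after 30 minutes
```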
Advantages of ozone water treatment include the minimisation of inorganic, organic and microbiological problems, as well as taste and odour problems. Furthermore, no additional chemicals are added to the water.
Disadvantages include the lack of a germicidal or disinfectant residual to inhibit or prevent regrowth. Furthermore, the system may require pre-treatment for hardness reduction.
Types of water treatment chemicals (and why they are used)
Chemical disinfection of drinking-water includes any chlorine-based technology, such as chlorine dioxide, as well as ozone, some other oxidants and some strong acids and bases. Except for ozone, proper dosing of chemical disinfectants is intended to maintain a residual concentration in the water to provide some protection from post-treatment contamination during storage.
Disinfection of household drinking-water in developing countries is done primarily with free chlorine, either in liquid form as hypochlorous acid (commercial household bleach or more dilute sodium hypochlorite solution between 0.5% and 1% hypochlorite marketed for household water treatment use) or in dry form as calcium hypochlorite or sodium dichloroisocyanurate. This is because these forms of free chlorine are convenient, relatively safe to handle, inexpensive and easy to dose.
Chlorine is the most widely used primary disinfectant and is also often used to provide residual disinfection in the distribution system. Monitoring the level of chlorine in drinking water entering a distribution system is normally considered to be a high priority (if it is possible), because the monitoring is used as an indicator that disinfection has taken place. Residual concentrations of chlorine of about 0.6 mg/l or more may cause problems of acceptability for some consumers on the basis of taste.
Chlorine dioxide breaks down to leave the inorganic chemicals chlorite and chlorate. These are best managed by controlling the dose of chlorine dioxide applied to the water. Chlorite can also be found in hypochlorite solution that has been allowed to age.
Proper dosing of chlorine for household water treatment is critical in order to provide enough free chlorine to maintain a residual during storage and use. Recommendations are to dose with free chlorine at about 2 mg/l to clear water (< 10 nephelometric turbidity units [NTU]) and twice that (4 mg/l) to turbid water (> 10 NTU).
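To show how these dose recommendations translate into practice, the sketch below estimates the volume of hypochlorite stock needed for a batch of household water; the stock strength is an assumption (the text above notes household products of roughly 0.5-1% hypochlorite).

```python
# Illustrative household dosing calculation based on the 2 mg/L (clear, <10 NTU)
# and 4 mg/L (turbid, >10 NTU) free-chlorine recommendations above.
def hypochlorite_volume_ml(water_l: float, turbid: bool = False, stock_percent: float = 1.0) -> float:
    """Millilitres of hypochlorite stock to dose a batch of water.

    A stock of S percent hypochlorite contains roughly 10*S mg of chlorine per mL.
    """
    dose_mg_per_l = 4.0 if turbid else 2.0
    mg_needed = dose_mg_per_l * water_l
    mg_per_ml_stock = stock_percent * 10.0
    return mg_needed / mg_per_ml_stock

print(hypochlorite_volume_ml(20.0))               # clear water, 20 L -> 4.0 mL
print(hypochlorite_volume_ml(20.0, turbid=True))  # turbid water, 20 L -> 8.0 mL
```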
Monochloramine, used as a residual disinfectant for distribution, is usually formed from the reaction of chlorine with ammonia. Careful control of monochloramine formation in water treatment is important to avoid the formation of di- and trichloramines, because these can cause unacceptable tastes and odours.
A number of other chemicals may be added in treatment. These include substances such as sodium hydroxide for adjusting pH and, in certain circumstances, chemicals for fluoridation of drinking-water.
Contaminants of Emerging Concern including Pharmaceuticals and Personal Care Products
Contaminants of emerging concern (CECs), including pharmaceuticals and personal care products (PPCPs), are increasingly being detected at low levels in surface water, and there is concern that these compounds may have an impact on aquatic life. It is important for EPA to be able to evaluate the potential impact of CECs and PPCPs on aquatic life and have an approach for determining protective levels for aquatic organisms.
These chemicals have features that require additional consideration when applying existing ambient water quality criteria for the protection of aquatic life, using EPA’s 1985 Guidelines for Deriving Numerical National Water Quality Criteria for the Protection of Aquatic Life and Their Uses.
There are many CECs and PPCPs that act as so-called endocrine disruptors (EDCs). EDCs are compounds that alter the normal functions of hormones resulting in a variety of health effects. EDCs can alter hormone levels leading to reproductive effects in aquatic organisms, and evaluating these effects may require testing methodologies not typically available along with endpoints not previously evaluated using current guidelines.
The emerging contaminants may also demonstrate low acute toxicity but cause significant reproductive effects at very low levels of exposure. In addition, the effects of exposure to aquatic organisms during the early stages of life may not be observed until adulthood. Therefore, traditional toxicity test endpoints may not be sufficiently comprehensive for criteria derivation for these chemicals and the chemicals may also have specific modes of action that may affect only certain types of aquatic animals (e.g., vertebrates such as fish).
Therefore, EPA developed a White Paper, Aquatic Life Criteria for Contaminants of Emerging Concern: Part I Challenges and Recommendations, detailing the technical issues and recommendations that serve as a basis for modifying the 1985 guidelines. These modifications should enable the Agency to better address CECs and, when appropriate, develop ambient water quality criteria for the protection of aquatic life that make the best use of available science.
EPA’s Office of Water asked the Science Advisory Board (SAB) for advice on the scientific merit of a white paper that identifies and addresses technical issues in deriving aquatic life criteria for emerging contaminants such as pharmaceuticals and personal care products exhibiting endocrine disrupting activity or other toxic mechanisms.
Perfluoroalkyl Compound Plume at the Lakewood Industrial Park, New Jersey
Perfluoroalkyl compounds (PFCs) are a family of persistent emerging contaminants with widespread environmental occurrence in a variety of media, including aquatic systems (Ahrens 2011, Ferrey et al. 2012, Post et al. 2013). They have unique properties that make them useful in a wide range of products and industrial applications (Lindstrom et al. 2011, Post et al. 2012). PFCs are soluble in water, which aids their ability to disperse in the environment (Eschauzier et al. 2012, NJDEP 2014). PFCs are currently unregulated contaminants in drinking water. The U.S. Environmental Protection Agency (USEPA) has developed a Public Health Advisory of 0.4 micrograms per liter (μg/L) (400 nanograms per liter (ng/L)) for short-term (defined by USEPA IRIS as up to 30 days) exposure to perfluorooctanoic acid (PFOA), one of the most common PFC compounds found in the environment (USEPA 2009). The New Jersey Department of Environmental Protection (NJDEP) has also issued a health-based drinking water guidance level of 0.04 μg/L (40 ng/L). NJDEP's guidance is intended to protect from chronic (lifetime) exposure, normally defined as 70 years, consistent with other New Jersey drinking water guidance values, drinking water standards, and ground water standards (Post et al. 2009).
In a 2009 statewide study of perfluoroalkyl compound (PFC) occurrence in public water supplies conducted by NJDEP, the concentration of the PFC compound perfluorooctanoic acid (PFOA) was higher in a drinking water intake along the South Branch Metedeconk River in Ocean County than in the other raw surface water sources tested. The Brick Township Municipal Utilities Authority (BTMUA), which relies on the Metedeconk River as its primary source of water supply, subsequently initiated a PFC source track down study in collaboration with the NJDEP Division of Science, Research, and Environmental Health. The data collected from a series of sampling events show that low levels of various PFCs are present in the study area and likely originate from a number of sources. However, BTMUA documented a localized area of high-level PFC contamination along the South Branch Metedeconk River in Lakewood Township. A groundwater contamination plume emanating from an industrial park on the south side of the river is suspected to be the principal source of PFCs observed in the Metedeconk River and the BTMUA intake samples. Groundwater PFOA levels were found to be as high as 70,000 ng/L in this area. While various PFCs were detected in water samples throughout the study area, and particularly in groundwater samples, PFOA is the primary contaminant of concern with respect to South Branch Metedeconk River water quality and the BTMUA water supply.
Discussion
The primary PFC found in the BTMUA drinking water intake was PFOA, which was also the primary PFC found at high concentrations in the study samples. Various other PFCs were detected in water samples throughout the source track down study area and were most pronounced in the groundwater samples.
During the course of the study, numerous environmental records and databases were reviewed and field surveys were conducted to identify and document any indications of dumping, negligent business practices, or poor housekeeping. Several suspected illicit discharges were identified, including process water from granite manufacturing facilities, recycled water from a commercial car wash, and vehicle wash water from the lots of large commercial auto dealerships. The information gathered offered few leads as to the PFC contamination source. Specific leads were either rejected based upon the sampling results or deemed insignificant given their magnitude relative to the observed PFC levels in the South Branch Metedeconk River. The locations of groundwater samples with extremely high PFC concentrations were used to isolate the most likely PFC sources to the parcel level. The plume likely originates in the Lakewood Industrial Park, and a small location within this industrial park has been identified as a probable source. Based upon the assumption that groundwater in the area generally follows the surface topography and flows towards the River, the contamination source is most likely confined to one of three possible properties located in Lakewood Township on the south side of Swarthmore Avenue and east of Lehigh Avenue.
A facility located on these three properties manufactures industrial fabrics, composites, and elastomers, and uses or produces products that contain PFCs. In light of the groundwater sampling data, and in comparison to the other facilities in the area, this facility appears to be the most probable source and warrants further investigation. However, some other, as yet unknown source cannot be ruled out. In the event that groundwater flow assumptions are incorrect for this area, several other properties align with the groundwater plume area on the north side of Swarthmore Avenue. None of those properties appear to be manufacturers or users of PFC compounds. It is unclear how long PFCs have been contaminating the groundwater in this area. However, during Sampling Event 8, a split sample from one site was analyzed by USEPA's National Exposure Research Laboratory in North Carolina and evaluated for the presence of branched and linear PFCA isomers (Strynar and Lindstrom 2013). The presence of both linear and branched isomers and the presence of both even and odd numbered carbon chains suggest an older source of contamination (Benskin et al. 2012, Strynar 2014).
The process that produced both branched and linear isomers and a relative mix of even and odd numbered carbon chains, known as electrochemical fluorination, was the dominant manufacturing process between the 1950s and 2002 and has since been phased out. Upon completion of this study and receipt of the final report, the NJDEP Site Remediation and Waste Management Program contacted the potentially responsible party to take appropriate remedial action.