The development of public water systems and the crisis in Flint

The provision of clean, safe drinking water has been a fundamental concern of civilizations since ancient times. Access to water is a basic requirement of modern life, central to a wide range of activities, from agriculture and industry to personal hygiene and cooking.

In the developed countries, the past two centuries in particular have seen enormous advances in the treatment and delivery of water, accompanied by radical improvements in public health and the quality of life. Advances in water treatment have curtailed the spread of deadly diseases such as typhoid and cholera and helped dramatically increase life expectancy.

So it was with a combination of shock and horror that millions throughout the world reacted to the water crisis in Flint, Michigan, once the center of the General Motors empire. The contamination of the city’s water supply with lead and the poisoning of Flint’s 100,000 residents brought back images of the dark times before the development and application of modern methods for scientifically testing and treating water.

The issue in Flint was not lack of scientific knowledge or technological expertise, but a calculated decision by the authorities at all levels to put the population at risk in the face of known dangers. Local officials, with the complicity of the state government and the federal Environmental Protection Agency, made the decision to use highly corrosive water from the Flint River in place of the city’s longtime water source without applying corrosion controls. The Flint River water leached lead from the city’s antiquated piping, leading to the contamination of the water supply.

Flint River water has also been linked to an outbreak of Legionnaires’ disease that caused at least 10 deaths.

The events in Flint are a sharp expression of a historical retrogression in the United States, where gains made by the working class in an earlier period are being stripped away. Despite enormous advances in scientific understanding and technology, the clock of history is being rolled back.

A review of the history of water treatment and distribution is valuable in order to place the present crisis within a wider perspective. From the dawn of civilization to the present, access to water has been a precondition for the progress of society.

Less than one percent of the earth’s water is suitable for drinking in its natural state; the rest is in the oceans or the polar ice caps. Some 98 percent of liquid fresh water is groundwater, much of it very deep beneath the earth’s surface, making pumping expensive. Despite this, there is ample water to supply human needs, given the application of modern technologies.

The earliest civilizations, in Sumeria and Egypt, grew up next to water sources, the Euphrates and Nile rivers. In ancient Greece and India, writings on water treatment methods date back to 2000 BC. These methods included sand and gravel filtration, boiling and straining.

Between 300 and 200 BC, Rome built its first aqueducts. Eventually eleven were constructed over a period of 500 years, with a combined length of some 400 kilometers.

Despite these initial achievements, major advances in the treatment and distribution of water did not take place until the dawn of the industrial age. The very concept of public health was bound up with Enlightenment ideas, which championed knowledge and reason over faith and superstition. Social reformers began advancing the idea that improvements in public health had an economic value to society. [1]

However, initial progress was slow. While it was generally felt that good health was somehow bound up with access to clean water, the origin and transmission of disease was not understood.

In 1652, the city of Boston established the first water works in what is now the United States. It was used for fighting fires as well as for domestic purposes.

Early water systems used hollowed-out logs to distribute water. However, by the early 19th century, cast iron pipes started coming into use.

Philadelphia built the first US municipal water system beginning in 1801 in the wake of a devastating yellow fever epidemic. Water was piped into the city and was freely available to citizens at public hydrants. It was also the first city in the world, in 1804, to use cast iron pipes for its water mains.

In 1842, the Croton Aqueduct began serving New York City, carrying water 41 miles by force of gravity from the Croton River. It came into use as local water supplies became polluted: in 1830, the city suffered an astounding mortality rate of 2.6 percent (1 in every 39 people), and in 1832 a cholera epidemic ravaged the city.

In 1869, the city of Chicago laid a tunnel two miles into Lake Michigan to draw the city’s water supply. The system included a massive three-foot-wide standpipe to equalize water pressure in the city, housed in the Chicago Water Tower, which still stands today.

It was, in fact, the industrial revolution and the crowding together of masses living in poverty in major centers like New York, London and Paris that gave impetus to the modern concept of public health.

“A substantial mortality penalty to living in urban areas therefore developed, as American cities grew during the 19th Century... In seven states with good data before 1900, urban mortality was 30 percent higher in cities than rural areas in 1890... the gradient was steeper for children.” [2]

In 1842, the British social reformer Edwin Chadwick published his “Report on the Sanitary Condition of the Labouring Population of Great Britain,” demonstrating that life expectancy was much lower in the towns than in the countryside. He argued that reforms could improve people’s lives and that a healthy population would be more productive. Among the reforms he recommended were measures to improve the drainage of streets and to supply clean, fresh water to the populations of cities.

In 1849, the British physician John Snow published a paper linking the transmission of cholera to contaminated water supplies. In 1854, he traced a cholera outbreak in London to a public pump. He went on to document in detail the relationship between the quality of the water source and the incidence of cholera, showing, for example, that neighborhoods drawing water from polluted reaches of the Thames had a higher incidence of the disease. His findings led to significant changes in London’s water and waste disposal systems, which spread to other cities.

In 1872, Poughkeepsie, New York began the first widespread use of water filtration in the US, and in 1897, Maidstone, England became the first city in the world to treat its entire water supply with chlorine.

In 1875, Britain passed a Public Health Act that, among other things, required all new construction to contain running water and a drainage system. Local authorities had to appoint medical officers in charge of public health.

An outbreak of typhoid in France in 1882 led the Paris Municipal Council to press for the connection of all wastewater outlets to the city’s sewage system.

By the dawn of the 20th century, public health was still in its infancy, and waterborne diseases such as typhoid killed tens of thousands. “Indoor plumbing was rare, especially in the countryside, and in cities it was inadequate at best. Tenements housing as many as 2,000 people typically had not one bathtub. Raw sewage was often dumped directly into streets and open gutters; untreated industrial wastes went straight into rivers and lakes, many of which were sources of drinking water; attempts to purify water constantly fell short, and very few municipalities treated wastewater at all.” [3]

With the discovery by the German bacteriologist Robert Koch that disease was spread by microbes transmitted through air and water, radical changes were made in the advanced industrial countries, including improvements in the disposal of waste and better treatment and filtration of public water supplies.

However, despite the proven advantages, social reformers faced enormous resistance to improvements in water systems, which required the expenditure of large sums of money. Nevertheless, over the course of the following decade, most major American cities implemented water filtration and chlorination.

In 1907, Chicago closed its last sewage outfall into Lake Michigan, and in 1908, the Jersey City Water Works began the first large-scale chlorination of water in the US at the Boonton Reservoir. That same year, Chicago began chlorinating water at its Bubbly Creek plant.

In 1913, the city of Los Angeles completed an aqueduct running 230 miles from the Owens River, east of the Sierra Nevada, to supply the city with fresh water.

The results were dramatic. By 1918, over 1,000 US cities were chlorinating their water supplies and by 1923, the typhoid death rate had dropped by 90 percent from the level one decade earlier. Between about 1900 and 1940, mortality rates in the United States and other major industrial countries fell by the greatest amount in history.

According to one study [4], half of the reduction in mortality was due to improvements in public water supplies. Access to clean water led to the near eradication of typhoid fever: “Filtration and chlorination together reduced typhoid fever mortality by 25 percent, total mortality by 13 percent, infant mortality by 46 percent and child mortality by 50 percent.”

It was not until 1914 that the US Public Health Service implemented the first regulation of drinking water quality, and it applied only to contaminants capable of causing infectious disease. Community water systems were not required to comply with the standards, but most did. The rules set limits on the amount of bacteria permitted in domestic water supplies but did not regulate chemical pollutants.

The standards were upgraded in 1925. This time, the rules set limits on lead, copper and zinc, as well as soluble minerals. The standards were limited to water systems that supplied interstate carriers, only about 2 percent of the US water systems. The Public Health Service again revised drinking water standards in 1942, 1946 and 1962.

With the progress in halting waterborne communicable diseases, attention turned to other dangers. The US began the widespread use of lead water pipes in the late 1800s; according to one report, by 1900 more than 70 percent of US cities with populations greater than 30,000 used lead water lines. [6] Lead, though more expensive than iron, had two advantages: it lasted longer and was more malleable.

The dangers of lead were recognized as far back as ancient times, and by the late 1800s warnings were being raised in the United States. By the 1920s, many towns had banned the use of lead pipes. [5]

However, the use of lead pipes for water distribution continued long after they were identified as a health risk, owing to intense lobbying by the lead industry and continued government inaction.

This continued despite a wealth of documentation of the harmful effects of lead water lines. In 1890, for example, the Massachusetts board of health advised cities and towns to avoid lead water pipes after a well-documented case of mass lead poisoning in Lowell, where officials had switched the town’s water supply to a highly corrosive source.

In the 1920s, the National Lead Company ran ads extolling the supposed health benefits of lead. Founded in 1928, the Lead Industries Association (LIA) undertook an intensive lobbying campaign on behalf of lead manufacturers to promote the use of lead water lines as well as lead-based paint. The LIA made an extensive effort to reverse the downward trend in the use of lead pipes by lobbying plumbers’ groups, local administrations and federal agencies. It sought to cast doubt on reports documenting the negative health effects of lead, and it even drafted plumbing codes mandating the use of lead pipes.

The campaign paid off. The state of Pennsylvania mandated the use of lead piping, and Massachusetts repealed its ban on the use of certain types of lead pipes. LIA lobbyists even succeeded in getting lead included in US federal government specifications for plumbing, and lead company representatives visited federal construction projects to persuade builders of the advantages of lead.

In 1952, the LIA published a book extolling the use of lead in water lines. Yet the lead industry was well aware of public health concerns over the use of lead. One LIA official cautioned in 1959, “The toxicity of lead poses a problem that other nonferrous industries do not face. Lead poisoning, or the threat of it, hurts our business in several different ways. While it is difficult to count exactly in dollars and cents, it is taking money out of your pocket every single day.” [7]

By the late 1960s, however, it had become apparent that additional oversight was needed. Water supplies now faced not only aesthetic problems, pathogens and naturally occurring chemicals, but also manmade toxins generated by the industrial and agricultural processes of the era.

A survey conducted by the Public Health Service in 1969 produced sobering results: only 60 percent of the nation’s water systems delivered water that met the agency’s standards. A 1972 study found 36 chemicals in treated water taken from treatment plants along the Mississippi River.

The lead industry continued its lobbying efforts until at least 1972. A 1984 survey by the Environmental Protection Agency reviewed 153 public water systems in the US; of those, 112 said they had installed lead water lines in the past, and five stated that lead had still been permitted well past 1930. A number of major cities, including Boston, Chicago, San Diego and Philadelphia, continued to allow the use of lead.

It was not until 1986 that the US federal government finally banned the use of lead piping nationwide. Flint itself allowed the use of lead piping right up to the 1986 ban.

Decades of delay in banning and eradicating lead piping have had a tragic impact. An estimated three to six million miles of lead piping remain in use in the United States, and unsafe lead levels are being discovered in city after city.

What is required is a nationwide, multi-billion-dollar infrastructure project to replace lead service lines and plumbing in order to provide safe drinking water to all. This will never be accomplished as long as private interests dominate every facet of economic and social life. The crisis in Flint is a particularly devastating expression of the catastrophe produced by the subordination of society to the profit drive of capitalism.

Notes

[1] “The Role of Public Health Improvements in Health Advances: The 20th Century,” David Cutler and Grant Miller

[2] Cutler and Miller

[3] National Academy of Engineering, “Clean Water Challenge”

[4] Cutler and Miller

[5] American Journal of Public Health, September 2008

[6] “The Lead Industry and Lead Water Pipes: A Modest Campaign,” Richard Rabin

[7] Rabin
