by Patricia Burke of Safe Tech International, with Kate Kheel
Disclaimer: I am not an expert on the hydrological cycle, but it appears that mainstream news reports about Utah’s disappearing Great Salt Lake, by focusing on farmers and climate, may be missing the bigger picture.
Great Salt Lake is the largest saline lake in North America. The lake and its wetlands form a
keystone ecosystem that supports biodiversity and human economy throughout the Western Hemisphere. – Source
Framing Others
In Massachusetts and beyond, the Karen Read murder trial is a made-for-reality-television story that has divided a town and captivated the public’s attention. CNN reported, “Her defense has accused off-duty police inside that Canton home of killing O’Keefe and framing Read.” The prosecution’s case [] has been hampered by a series of missteps and unusual investigative practices. Rita Lombardi, a local Read supporter outside the court, told CNN [] “This woman was framed, and there is overwhelming evidence, overwhelming scientific evidence that she was framed for murder by the very people we trust to protect us.” Central to the trial is the question of how the police department conducted its investigation, and whether Karen Read was framed. The case is ongoing.
This article is not as titillating a story as the Karen Read trial – but it asks a few questions.
Regarding the disappearing Great Salt Lake in Utah, what if the farmers and ‘climate’ are being framed by the very people we trust to protect us? Are we unwittingly relying on ‘unusual investigative practices’?
My colleague Kate Kheel recently sent me an email. “On the way to Baltimore today, I listened to NPR’s coverage of The Great Salt Lake that is projected to be completely dry within 5 years. The reasons given included global warming and agriculture.
But surprisingly (not), the huge NSA data center erected in 2014 for surveillance purposes, and Facebook’s META were not mentioned. That got me to researching.”
Do we have an incomplete picture of what is driving what we call climate change?
“By computing the water usage rate, one could ultimately determine the computing power and capabilities of the Utah Data Center,” wrote the NSA’s associate director for policy and records, David Sherman, in an undated letter filed with Bluffdale in response to the Tribune’s public records request. “Armed with this information, one could then deduce how much intelligence NSA is collecting and maintaining.”
Who Counts What – Counts
In November 2023, in the article Military emissions – the weak point in the war on climate change?, Army Technology reported that international climate accords do not take into account military emissions. “As the largest institutional consumer of energy in the US, the Department of Defense (DoD) has faced criticism for not instigating change on military emissions reporting.” [] “The US has a larger military expenditure than any other country, amounting to $877bn in 2022, according to the Stockholm International Peace Research Institute.” [] “Scientists and academics from universities including Oxford, Columbia, Harvard, Lancaster, Stanford, Durham and Queen Mary have released numerous studies outlining the hundreds of millions of tonnes of military carbon emissions that have gone unaccounted for in the stocktake.” – SOURCE
Likewise, when we look at “farmers” and water consumption, are we also factoring in water-guzzling data farms?
In addition to ‘not counting’ military carbon emissions, how does the planet’s hydrological cycle respond when accurate data about data centers is not factored in? Are we causing avoidable planetary desertification by not accurately counting technology’s footprint?
In reverse chronological order, some of the resources that Kate uncovered point to a possible scenario: the excessive water consumption of data centers via evaporation differs from farming consumption patterns, and may be changing the climate and contributing to or causing drought – and not just in Utah.
AI’s Outrageous Environmental Toll: July 2024 (Futurism and Wired)
On July 14, 2024, in the Substack News Summary for Safe Tech International, we highlighted an article by Frank Landymore published by Futurism: AI’s Outrageous Environmental Toll Is Probably Worse Than You Think
He wrote, “Consider the obscene amounts of water that’s needed just to cool the data centers that train and host generative AI models, which is somewhere in the millions of gallons per year. Internal estimates from Microsoft about its data facility in Goodyear, Arizona, for example, show that it’s set to annually consume 56 million gallons of drinking water — which is more than a drop in the ocean for such a water-scarce region.
But as Wired reports, the way data centers waste water is even worse than how households would waste it by leaving the tap running. “The water that is available for people to use is very limited,” Shaolei Ren, a responsible AI researcher at UC Riverside, told the magazine. “It’s just the fresh surface water and groundwater. Those data centers, they’re just evaporating water into the air.” “When we get the water from the utility, and then we discharge the water back to the sewage immediately, we are just withdrawing water — we’re not consuming water,” Ren continued. “A data center takes the water from this utility, and they evaporate the water into the sky, into the atmosphere.” And once evaporated, that water doesn’t come back to Earth for another year.
“For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors,” he said, estimating that these generative AI applications are 100 to 1,000 times more intensive. Other estimates put Google’s AI search in the ballpark of consuming ten times as much energy as a regular search. To reiterate:
“Those data centers, they’re just evaporating water into the air.” “A data center takes the water from this utility, and they evaporate the water into the sky, into the atmosphere.” And once evaporated, that water doesn’t come back to Earth for another year.
The quote is sourced from a July 11 Wired article by Reece Rogers: AI’s Energy Demands Are Out of Control. Welcome to the Internet’s Hyper-Consumption Era. Generative artificial intelligence tools, now part of everyday user experience online, are causing stress on local power grids and mass water evaporation.
March 2024: Great Salt Lake, Utah (NPR)
Farmers accused of drying up the imperiled Great Salt Lake say they can help save it
In a lawsuit filed last year against the state, Seed’s group (the Center for Biological Diversity) and others went on to call the drying Great Salt Lake — the American West’s largest and one of the world’s few remaining giant saline lakes — a looming ecological collapse.
That suit came on the heels of a report by Brigham Young University scientists that set off alarm bells for many in the state saying the lake could dry up within five years if no action is taken. The lawsuit seeks to force state leaders to drastically cut how much water farmers use and let it flow into the lake.
But as spring planting time approaches, many farmers say fears about the lake drying up are being overblown. Environmentalists want to stop alfalfa farming in the desert. The Great Salt Lake’s shrinking is already killing the brine shrimp that feed migratory birds. []
Seed says Utah leaders just aren’t making the tough political choice to cut water to alfalfa farms and dairies north of the lake. “We can’t afford to continue to do that with our water in the West,” she says. “Those farmers need to be compensated so they can retire from farming.” This is a popular refrain right now from California to Utah, where precious river water channeled through irrigation canals is used to grow thirsty feed crops, some for export, even as droughts in the West are getting worse with climate change. Utah is the second driest state in the country, yet it also has the highest water use per capita. But farmers say they’re being unfairly maligned in the fight over the imperiled lake. []
January 2023: Brigham Young University
The NPR article links to the BYU study Emergency measures needed to rescue Great Salt Lake from ongoing collapse (download available, 34 pages). The BYU study states: “Causes of decline: After millennia of natural fluctuations, human water use has pushed Great Salt Lake into structural decline.
Since 2020, the lake has lost just over one million acre-feet of water each year [], much more than predicted by current hydrographic models [] If this rate of water loss continues, the lake would be on track to disappear in the next five years. The lake is now 10 feet and 6.9 million acre-feet below its minimum healthy level, which has only been attained once since 2002[]
Saline lakes are highly vulnerable to water overuse because they depend on a delicate balance between streamflow and evaporation. Consequently, there is a strong relationship between the area of irrigated agriculture in a saline lake’s watershed and the severity of its shrinkage. Agriculture began affecting Great Salt Lake levels in the mid-1800s. However, it wasn’t until the 1900s that humans became the dominant force controlling the lake. Federal and state construction of dams, canals, and pipelines in the 1900s allowed more of the watershed’s natural runoff to be diverted for agricultural, industrial, and municipal use. These subsidized water projects led to unsustainable water consumption and plummeting lake levels through the 1960s.
Over the last three years, the lake has received less than a third of its natural streamflow because of excessive water diversions. In 2022, the lake dropped to a record elevation of 4188’—the lowest level on the state’s contingency charts []. The depletion of water is even more severe than it appears because groundwater is not included in these estimates. Approximately 26 million acre-feet have been lost from the lake itself, but twice that amount may have been lost from the aquifers around the lake due to water table drop. These empty aquifers could slow the rate of rebound after runoff is increased.” – BYU Study
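The BYU report’s loss rate is easier to grasp after a quick back-of-envelope conversion. The sketch below is mine, not the study’s: the acre-feet figures come from the report quoted above, and the gallons-per-acre-foot factor is the standard US conversion.

```python
# Back-of-envelope conversion of the BYU report's figures.
ACRE_FOOT_GALLONS = 325_851  # standard US conversion: gallons per acre-foot

annual_loss_af = 1_000_000   # ~1 million acre-feet lost per year (BYU estimate)
deficit_af = 6_900_000       # acre-feet below the minimum healthy level (BYU estimate)

annual_loss_gal = annual_loss_af * ACRE_FOOT_GALLONS
daily_loss_gal = annual_loss_gal / 365

print(f"Annual loss: {annual_loss_gal:,.0f} gallons")
print(f"Daily loss:  {daily_loss_gal:,.0f} gallons per day")
print(f"Years of current losses already accumulated as deficit: "
      f"{deficit_af / annual_loss_af:.1f}")
```

At the study’s rate, the lake is shedding on the order of 326 billion gallons a year — roughly 900 million gallons a day — which puts the monthly data center figures later in this article in context.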
February 2024: AI, West Des Moines, Iowa (Hyperscaler)
In February, Hyperscaler published Water Consumption of AI: How Tech Giants are Draining the Planet 2024.
The Extent of Water Usage by AI
According to a recent study in Nature, the water consumption of AI is projected to reach between 4.2 billion and 6.6 billion cubic meters by 2027. This is equivalent to nearly half of the UK’s annual water consumption. The primary reason for this high water usage is the need to cool the servers that run complex AI models, especially those utilizing generative AI for processing and generating text, data, and other forms of information. These AI models operate in massive data centers, where chilled water systems absorb the heat generated by the servers. While some of this water can be recycled, a significant amount evaporates, leading to substantial water consumption.
Additionally, the production of electricity required to power these data centers indirectly contributes to water usage through energy production methods such as thermal power plants and hydroelectric dams. The growing water consumption by AI data centers has several negative repercussions on both the environment and society.
Environmental Impact
The excessive use of water by data centers exacerbates the global water crisis, which already affects over two billion people who lack access to safe drinking water. Furthermore, it intensifies the effects of climate change, contributing to severe droughts, floods, and wildfires that jeopardize ecosystems and human communities.
Social and Economic Impact
The strain on water resources can lead to conflicts and inequalities, particularly in regions where water is already scarce. For instance, in West Des Moines, Iowa, a cluster of data centers hosting OpenAI significantly depleted the local aquifer, reducing water pressure and impacting residents’ quality of life and property values. Additionally, popular AI services like OpenAI’s ChatGPT are heavy water users. Research indicates that generating responses from ChatGPT could consume as much water as a 500ml bottle for every 10 to 50 prompts, depending on various factors. Newer, more powerful models like GPT-4 are likely to consume even more water, though exact figures are not yet available. – Hyperscaler
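The per-prompt range quoted above can be worked out in a few lines. In this sketch, the 500 ml bottle per 10 to 50 prompts comes from the research Hyperscaler cites; the daily prompt volume is a purely hypothetical number I chose for illustration, not a reported figure.

```python
# Per-prompt water cost implied by "one 500 ml bottle per 10-50 prompts".
BOTTLE_ML = 500

best_case_ml = BOTTLE_ML / 50    # fewest ml per prompt (50 prompts per bottle)
worst_case_ml = BOTTLE_ML / 10   # most ml per prompt (10 prompts per bottle)

# Hypothetical daily prompt volume, for scale only (not a reported figure).
daily_prompts = 100_000_000

liters_low = daily_prompts * best_case_ml / 1000
liters_high = daily_prompts * worst_case_ml / 1000

print(f"Implied range: {best_case_ml:.0f}-{worst_case_ml:.0f} ml per prompt")
print(f"At {daily_prompts:,} prompts/day: "
      f"{liters_low:,.0f}-{liters_high:,.0f} liters evaporated daily")
```

Even at the best-case 10 ml per prompt, the water cost compounds quickly once prompt volumes reach the hundreds of millions.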
July 2022: Utah (Standard Examiner)
Yes, data centers use a lot of water. One Utah company shows it doesn’t have to be that way.
“The United States accounts for about 40% of the world’s large cloud and internet data sites. Utah is home to at least 25 “colocation” data centers, according to a Data Center Map estimate, most of them near Salt Lake City. Those sites, like Novva, rent servers to a variety of customers, from gas stations to medical companies. The figure doesn’t include big centers run by single entities, like the U.S. National Security Agency’s operation in Bluffdale or Facebook’s facility in Eagle Mountain.
And most data centers, regardless of their operators, rely on evaporative cooling to keep their servers at optimal temperatures. Some gulp down millions of gallons each month. “It’s a worse story than you can imagine,” Swenson said. “They’re using culinary water, not irrigation [water], so it’s been treated by the city. … It’s a complete waste of a resource.” [] Data centers further treat their water with chemicals to prevent things like scale buildup and Legionnaires’ disease. Some of the water is lost through the evaporative process, while the rest can be reused only a few times before it has to be flushed from the system. Unfortunately, Swenson said, “the places where evaporative cooling works the best are the places where there is the least amount of water.”
As climate change fuels drier conditions and the West’s current “megadrought” shows no signs of ending, some of those arid communities are protesting water-guzzling data facilities. In the early 2010s, Utahns worried so much about the NSA data center’s water consumption — along with Edward Snowden’s revelations that the agency was illegally spying on U.S. citizens — that a Republican legislator proposed shutting off its supply. [] Facebook’s parent company, Meta, has adopted sustainability goals for its own data centers. Still, it consumed 13.5 million gallons in the year between June and May at its 970,000-square-foot Utah site, according to information provided by Eagle Mountain.
The C7/DataBank center in Bluffdale — which is a tenth the size of the NSA and Facebook campuses — used 6.9 million gallons in the past year. Novva, meanwhile, has consumed about 1.1 million gallons since its servers went on line in January, according to West Jordan. In May, it used 585,000 gallons — its highest monthly rate of consumption to date. By comparison, Facebook used 1.3 million that month and the NSA chugged nearly 13 million gallons. [] Why some say data centers are worth their weight in water Beyond water consumption, data centers don’t create many jobs. They also eat up a lot of energy [] Still, the Utah cities that welcome the facilities say they offer a lot of benefits. [] In a statement, an NSA spokesperson said the agency is working to limit water use and be a responsible member of the community. [] What about data centers’ high energy demands? Water improvements aside, data centers devour a lot of electricity. – SOURCE
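The Standard Examiner’s May figures are easier to compare when normalized to daily rates. This is a simple sketch using only the monthly totals quoted above (31 days in May); the rounding is mine.

```python
# Approximate daily water use implied by the May monthly totals
# quoted from the Standard Examiner article.
MAY_DAYS = 31

may_use_gallons = {
    "NSA (Bluffdale)":          13_000_000,  # "nearly 13 million gallons"
    "Facebook/Meta (E. Mtn.)":   1_300_000,
    "Novva (West Jordan)":         585_000,
}

for site, gallons in may_use_gallons.items():
    print(f"{site:26s} ~{gallons / MAY_DAYS:>9,.0f} gallons/day")
```

On these numbers, the NSA facility alone was drawing roughly 420,000 gallons of treated culinary water per day — about ten times Meta’s Utah site and more than twenty times Novva’s.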
June 2021: Mesa, Arizona (NBC News)
Drought-stricken communities push back against data centers
As cash-strapped cities welcome Big Tech to build hundreds of million-dollar data centers in their backyards, critics question the environmental cost. But keeping the rows of powerful computers inside the data center from overheating will require up to 1.25 million gallons of water each day, a price that Vice Mayor Jenn Duff believes is too high. “This has been the driest 12 months in 126 years,” she said, citing data from the National Oceanic and Atmospheric Administration. “We are on red alert, and I think data centers are an irresponsible use of our water.”
The spike in use of data-intensive cloud services such as video conferencing tools, video streaming sites like Netflix and YouTube and online gaming, particularly as people quarantined during the pandemic, has increased demand for the computing power offered by data centers globally. And this means more data centers are being built every day by some of America’s largest technology companies, including Amazon, Microsoft and Google and used by millions of customers. According to the Synergy Research Group, there were about 600 “hyperscale” data centers, massive operations designed and operated by a single company that then rents access to cloud services, globally by the end of 2020. That’s double the number there were in 2015. Almost 40 percent of them are in the United States, and Amazon, Google and Microsoft account for more than half of the total. – NBC news
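The Mesa figure annualizes easily. In this one-line sketch, the 1.25 million gallons per day is the reported upper bound from the NBC News article above; the non-leap-year assumption is mine.

```python
# Annualizing the reported upper bound for the Mesa data center:
# up to 1.25 million gallons of cooling water per day.
daily_gal = 1_250_000
annual_gal = daily_gal * 365  # simple non-leap-year estimate

print(f"{annual_gal:,} gallons/year")  # 456,250,000
```

That is over 450 million gallons a year for a single facility in one of the driest regions in the country.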
July 2015: Utah (Sage Journals)
From Data flows and water woes: The Utah Data Center
“Using a new materialist line of questioning that looks at the agential potentialities of water and its entanglements with Big Data and surveillance, this article explores how the recent Snowden revelations about the National Security Agency (NSA) have reignited media scholars to engage with the infrastructures that enable intercepting, hosting, and processing immeasurable amounts of data. []
Specifically, I explore the NSA’s infrastructure and the millions of gallons of water it requires daily to cool its servers, while located in one of the driest states in the U.S. [] this article questions the emplacement and impact of corporate data centers more generally, and the changes they are causing to the landscape and local economies. I look at how water is an intriguing and politically relevant part of the surveillance infrastructure and how it has been constructed as the main tool for activism in this case, and how it may eventually help transform the public’s conceptualization of Big Data, as deeply material.”
[] during the Obama Administration, the NSA has created a colossal network of backup hard drives ensuring that failure at one facility is easily recovered from another. The NSA spreads and duplicates storage across multiple locales, including in Georgia, Texas, Colorado, and Hawaii, with a sizeable expansion at its headquarters in Fort Meade, Maryland (scheduled to open in 2016). This $860 million, 70,000-square-foot data center in Fort Meade is estimated to require 60 MW of energy to run and makes huge demands on water: up to five million gallons a day. This location would then detour wastewater (“gray water”) from the Little Patuxent River for the purposes of cooling its servers. – Sage Journals
March 2014 (Wired)
Why Does the NSA Want to Keep Its Water Usage a Secret?
The National Security Agency has many secrets, but here’s a new one: the agency is refusing to say how much water it’s pumping into the brand new data center it operates in Bluffdale, Utah. According to the NSA, its water usage is a matter of national security. If it revealed how much water it’s using in Bluffdale, the agency believes, outsiders could get a good idea of the scope of NSA surveillance. “By computing the water usage rate, one could ultimately determine the computing power and capabilities of the Utah Data Center,” wrote the NSA’s associate director for policy and records, David Sherman, in an undated letter filed with Bluffdale in response to the Tribune’s public records request. “Armed with this information, one could then deduce how much intelligence NSA is collecting and maintaining.” But, oddly enough, water usage has become a very contentious issue for the NSA. An anti-government group called the Tenth Amendment Center is calling for Utah to simply cut off the NSA’s water supply, saying that water is the $1.5 billion data center’s “Achilles Heel.” And last month, a state Republican lawmaker named Marc Roberts said he would introduce a bill that would do such a thing. – Wired
2014: Victory! NSA water records to be released in Utah
This month, the Utah State Records Committee ruled that the City of Bluffdale must release water records pertaining to the massive NSA data center located there. Salt Lake Tribune reporter Nate Carlisle pursued the information, and his success shows how a series of small, seemingly insignificant actions can lead to a major victory. The committee voted unanimously to require the city to make details of the NSA’s water use public last week. “We felt the law was on our side,” Carlisle told KUTV News. “We also felt there was a public interest in knowing how much water the NSA is using in Utah, so Utahns are informed about the role of the NSA in their state.” The city of Bluffdale and the NSA were both initially unwilling to give up the details about their arrangement. Water usage estimates range between 1.2 and 1.7 million gallons of water per day, but the NSA was very ambiguous regarding the specific figure, insisting it was a matter of “national security.” See the short 1-minute video: No Water = No NSA Data Center, #NullifyNSA campaign from OffNow.org, https://www.youtube.com/embed/8ANUo8BnYoo
July 2024: Did You Know? The Politics of Reclaiming “Safety” (Politico)
In the July 25th Safe Tech International News and Notes, we included an article from Politico, AI and POLITICS: What voters want on AI from Trump. But what would voters actually want, if Trump won?
The article indicates that many Americans are questioning the wisdom of lax regulation, while re-thinking safety. Questioning the explosion of surveillance of U.S. citizens, fueling the demand for more data centers and more water and energy consumption, may be emerging more quickly from the right than the left.
“The Artificial Intelligence Polling Institute asked nearly 1,000 respondents last week, sharing the results exclusively with Digital Future Daily. The online poll asked them to rate the sometimes conflicting views that Trump allies and the man himself have expressed on AI. What they found might give pause to open-source acolytes and out-there accelerationists alike — and, perhaps unexpectedly, to the Republicans who are ready to line up behind Trump’s desire to repeal President Joe Biden’s sweeping AI agenda.
Those who responded to the poll, while often conflicted, have serious concerns about the safety of the technology itself. [] The poll also tackled an AI-adjacent issue that has become a key GOP talking point: AI’s thirst for energy. Trump recently suggested that Biden-era environmental regulations would stymie the technology’s development. A mere 33 percent of Trump voters agreed that “regulations should be eased on power generation” for AI, with 37 percent opposing such a move and 30 percent saying they weren’t sure. [] Asked whether Trump in a second term should prioritize keeping the U.S. ahead of China on AI or keeping Americans safe from it, they prioritized safety by a margin of 15 points. HERE
The Case Against Water Metering Infrastructure
In the 2023 report Emergency measures needed to rescue Great Salt Lake from ongoing collapse, researchers at Brigham Young University included a list of things NOT TO DO. The list includes the call not to build more infrastructure: “There continue to be calls to build more reservoirs and pipelines as a response to the ongoing drought. However, our current situation is not caused by inadequate surface-water storage. In fact, reservoirs represent a major source of consumptive water use. We don’t have enough water to keep current reservoirs full, and most suitable reservoir sites have already been developed.”
In addition, policies to control water consumption via wireless smart meters that transmit using RFR (radio frequency radiation) are examples of poorly conceived tech infrastructure solutions that will:
a) create more data and the demand for more data processing and data centers
b) increase surveillance
c) create a built-in cycle of obsolescence and increased e-waste for meters and infrastructure
d) discriminate against certain classes of customers on the basis of health
e) consume resources unnecessarily
See more here:
Smartmeters Health and Safety FAQs – Environmental Health Trust (ehtrust.org)
Physicians for Safe Technology | Smart Meter Radiation Health Effects (mdsafetech.org)
(In addition, scrutiny of the impact of microwaves on the structure of water, for example from installing antennas on water towers, increasing the ambient RFR exposures, and installing wireless meters and banks of meters, is inadequate, despite reported harm. )
Previously, fire alarms only sent a signal when there was a fire. We are now creating overlapping mesh networks that continuously send signals, unnecessarily, and without regard for safety or for the environment. Smart meters are often promoted for their ability to detect leaks. They do not have to transmit continuously in order to address this need.
Closing Thoughts:
When we don’t account for all forms of energy and water consumption, including the enormous growth and grotesque water and energy appetite of data farms, we do not have accurate data.
When we give military and NSA security interests a free pass on planetary environmental impacts, our data and decision-making are corrupted.
When we attribute blame to farmers and climate, it seems like someone else’s distant problem, and responsibility.
The decision to diminish water access for farming and to ignore water use by data farms will have consequences.
Our consumption of data is being insidiously increased, without our conscious awareness, including by the introduction of AI.
Our unexamined, increased demand for data (activated by the lockdowns and trauma) offers one avenue for personal decision-making and empowerment. We can begin to embrace digital sobriety – needed in nearly all ways, nearly everywhere, and needed now. Rather than blanketing the planet and its atmosphere with wi-fi, we need a tsunami of awareness that recognizes the Rights of Nature, including the Great Salt Lake’s rights.
Highly Recommended:
Financial Sense Newshour’s Jim Puplava speaks with Mark Mills, Executive Director of the National Center for Energy Analytics, about the collision between policy goals and economic realities when it comes to green energy, growing electricity demand from AI, data centers, and onshoring of semiconductor manufacturing. Mark explains how the US is not building enough electrical plants to keep up with demand, let alone enough wind and solar, so the gap will have to be filled by natural gas and coal, which are also seeing exploding demand globally. Mark also discusses the role of nuclear power and small modular reactors, and why places like Germany and California are seeing outright deindustrialization: their aggressive green energy policies fail to factor in the difficulty, if not near impossibility, of switching away from hydrocarbons and nuclear for electricity production. This is an eye-opening interview, packed with economic and financial market insights, with one of the brightest minds on the current conflict between policy, energy, and reality.
https://podcasts.apple.com/us/podcast/financial-sense-r-newshour/id306759846?i=1000655229628 (1 hour 13 minutes)
(National Center for Energy Analytics Fact-based Perspectives on Energy “NCEA’s scholars are devoted to data-driven analyses of policies, plans, and technologies surrounding the supply and use of energy essential for human flourishing.”)
Related Article:
When Politics and Physics Collide The belief that mandates and massive subsidies can summon a world without fossil fuels is magical thinking. – Mark P. Mills
There’s another category error in analogizing energy tech to computer tech. Consider the IMF’s musings on smartphone substitution and energy substitution. Unshackling the personal phone from wired connections didn’t doom wires and cables for communications—instead, it created an enormous expansion in communications traffic that, collaterally, drove greater need for wires and cables.
[] neither political rhetoric nor financial largesse can make the impossible possible.
Be on the right side of history.