Dual citizen of 🇺🇸 USA & Croatia 🇭🇷
Technologist & Businessman · Independent voter
Cutting expenses · For equitable management
andrew@andrewwerner.com

Academic Foundation

Southern New Hampshire University (SNHU)

B.S. in Environmental Science (Data Analytics) | 2023 | GPA: 4.0

SNHU is accredited by the New England Commission of Higher Education, the same accreditation body that accredits Harvard University and the Massachusetts Institute of Technology (MIT). I transferred from the University of Connecticut's Special Program in Pre-Pharmacy, where grief from the passing of a close friend and the COVID-19 pandemic made my studies difficult.

For geospatial data processing, I love Cloud-Optimized Geospatial Formats, and I have found Development Seed's zine to be very educational. Concerning the environmental footprint of AI, I trust this article funded by the Tarbell Center for AI Journalism. I wrote these words on the matter some time ago: Most data centers are inefficient. They consume massive amounts of energy and water, straining our resources. I’ve found some developing solutions we may see soon:

As a preface, energy powers data centers, generating that energy often consumes water, and water is also used to cool data centers.

Compute & Storage: Use efficient processors and storage. Neuromorphic processors are far more efficient than computationally expensive systolic-array-based AI accelerators. Similarly, ditch power-hungry, heat-spewing hard drives for SSDs, especially EDSFF form factors, which offer higher storage density in a smaller footprint, cutting energy use and cooling needs. An aside: magnetic tape is still needed for archiving cold data. Even CERN still uses tape. Unpowered hard drives and SSDs reliably retain data for only a few years. LTO tapes, when stored upright at a steady temperature between 60°F and 77°F (15°C to 25°C) and relative humidity (RH) between 20% and 50%, can last decades. Large fluctuations are more damaging than a steady, slightly sub-optimal temperature.
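
The storage envelope above is concrete enough to sketch as a check. This is a minimal illustration using the ranges quoted in the paragraph; the function names are my own, not from any vendor tool.

```python
# Check archive-room readings against the LTO storage envelope cited
# above: 60-77 °F (about 15-25 °C) and 20-50% relative humidity.

LTO_TEMP_F = (60.0, 77.0)   # storage temperature range, °F
LTO_RH_PCT = (20.0, 50.0)   # relative humidity range, %

def within(value, bounds):
    lo, hi = bounds
    return lo <= value <= hi

def tape_storage_ok(temp_f, rh_pct):
    """True when both temperature and humidity sit inside the LTO envelope."""
    return within(temp_f, LTO_TEMP_F) and within(rh_pct, LTO_RH_PCT)

print(tape_storage_ok(68.0, 35.0))  # comfortable office conditions: True
print(tape_storage_ok(85.0, 35.0))  # too hot for long-term storage: False
```

A steadier signal, per the note on fluctuations, would track variance over time rather than a single reading; this sketch only validates a snapshot.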

Cooling: Build data centers in colder climates, and adopt efficient cooling technologies. Two-phase immersion cooling fascinates me. It’s wild to see expensive computers submerged in what looks like water but is actually a non-conductive (dielectric) refrigerant. Submerged in a reservoir, the computers heat up, the refrigerant boils and evaporates, the gas cools on a condenser back into a liquid, and the liquid returns to the reservoir. Some older refrigerants were per- and polyfluoroalkyl substances (PFAS), which are terrible for the environment, but regulations spurred safer and more sustainable alternatives like hydrofluoroolefins (HFOs). These still contain fluorine, but fluorine appears in most modern refrigerants because its chemical properties are well suited to transferring heat. The European Union’s push (Regulation (EU) 2024/573) to phase out hydrofluorocarbons (HFCs) by 2050 worried me. However, HFOs are not included in the EU’s phase-out initiative, nor in the EU PFAS Restriction Proposal (under REACH). Instead, controls are introduced based on global warming potential (GWP) thresholds, and thankfully HFOs have a low GWP.
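
The boil-condense cycle above has a simple energy balance: each kilogram of fluid that evaporates carries away its latent heat of vaporization. A back-of-envelope sketch, where the ~142 kJ/kg figure is my assumed value typical of low-boiling-point dielectric fluids, not a quoted spec:

```python
# Back-of-envelope heat removal for two-phase immersion cooling.
# Assumption: latent heat of vaporization ~142 kJ/kg for the
# dielectric fluid (illustrative, not a vendor figure).

H_FG_KJ_PER_KG = 142.0

def boil_off_rate_kg_s(heat_load_kw):
    """Mass of fluid that must evaporate per second to absorb the load."""
    return heat_load_kw / H_FG_KJ_PER_KG

# A 100 kW tank boils off roughly 0.7 kg of fluid per second, all of
# which the condenser returns to the reservoir as liquid.
print(round(boil_off_rate_kg_s(100.0), 2))  # -> 0.7
```

The closed loop is the point: the same fluid cycles continuously, so the evaporation rate measures heat transport, not consumption.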

Energy: Hot take: some molten salt reactors (MSRs) don’t need water for cooling and can deliver the energy exascale data centers demand. It’s nuclear, but when done right, it is safe and pretty clean. If you’re anti-uranium, thorium is an excellent alternative that thrives in MSRs. Either way, the fuel can be recycled to reduce nuclear waste buildup: about 96% of uranium fuel is recyclable, and thorium fuel offers even greater recyclability, with nearly all of it being reusable.
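
The recyclability figures above imply a large reduction in waste mass. A trivial sketch of that arithmetic, using the paragraph's rounded percentages:

```python
# How much spent fuel remains as waste once the recyclable
# fraction is recovered (single reprocessing pass, rounded figures).

def waste_after_recycling(spent_fuel_t, recyclable_fraction):
    """Tonnes left as waste after the recyclable fraction is recovered."""
    return spent_fuel_t * (1.0 - recyclable_fraction)

# 100 t of spent uranium fuel at ~96% recyclability leaves ~4 t of waste.
print(round(waste_after_recycling(100.0, 0.96), 2))  # -> 4.0
```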

Beyond this, software should be efficient, and waste heat from data centers should be repurposed. While heat pipes efficiently transfer heat over short to moderate distances, they are not ideal for long-range transfer. Given this, a viable option for repurposing waste heat is to supply warmth to nearby buildings, including ancillary facilities on the data center campus and local communities. Other options exist, but using heat for warmth makes sense to me.
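
To give a rough sense of scale for the district-heating idea, here is a sketch where both inputs are my assumptions for illustration: a facility rejecting 10 MW of heat, and roughly 10 kW of peak heating demand per nearby home.

```python
# Rough scale of district heating from data-center waste heat.
# Assumed numbers: 10 MW of rejected heat, ~10 kW peak heating
# demand per home (illustration only; real demand varies widely).

def homes_heated(waste_heat_mw, kw_per_home=10.0):
    """Approximate number of homes the rejected heat could warm at peak."""
    return int(waste_heat_mw * 1000.0 / kw_per_home)

print(homes_heated(10.0))  # -> 1000
```

Even with generous losses, the order of magnitude suggests why campuses near residential areas are attractive sites for heat reuse.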

Concerning fusion power plants, I’m really optimistic about the future of this planet. All the energy we need seems close, and it is clean. I like TAE Technologies' ambitious plans. As I understand it, a particle accelerator creates an ion beam that is neutralized to deliver heat (laser photodetachment neutralization methods from the Lawrence Berkeley and Lawrence Livermore laboratories seem promising). Rather than a tokamak, TAE confines this immense heat in a field-reversed configuration, where magnetic pressure holds the gas as it transitions into a plasma in which atomic nuclei collide and fuse. Because TAE is targeting aneutronic hydrogen-boron fusion, the reaction produces charged particles rather than neutrons. This allows direct energy conversion: capturing the charged particles magnetically to generate electricity directly, bypassing traditional steam turbines entirely to provide abundant, clean energy.

I don't think we have to fear catastrophic atmospheric ignition, even in the unlikely event of a containment breach of the plasma. From my limited understanding, the atmosphere isn't dense enough for catastrophe; I think the event would be localized to the area around the reactor, but I'm not an expert. Edward Teller explored making a nuclear bomb capable of splitting continents, but the potential energy of our planet is vastly greater than any kinetic energy of our machinations. Nonetheless, the explosions and fallout of nuclear weapons are devastating. They make for effective deterrence, but I hate to see them used.
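
The hydrogen-boron reaction above has well-known energetics: each p + B-11 fusion yields three alpha particles carrying about 8.7 MeV in total. A sketch of what that implies for reaction rates, where the 1 MW power level is my assumption for illustration:

```python
# Reaction-rate arithmetic for aneutronic hydrogen-boron fusion.
# Each p + B-11 reaction releases ~8.7 MeV as charged alpha particles,
# which is what makes direct energy conversion possible.

MEV_TO_J = 1.602e-13                 # joules per MeV
E_PER_REACTION_J = 8.7 * MEV_TO_J    # energy yield per fusion event

def reactions_per_second(power_w):
    """Fusion reactions per second needed to sustain a given output."""
    return power_w / E_PER_REACTION_J

# Sustaining 1 MW takes on the order of 7e17 reactions per second.
print(f"{reactions_per_second(1e6):.1e}")  # -> 7.2e+17
```

The enormous rate is unremarkable by nuclear standards; the interesting part is that the output arrives as charged particles a magnetic collector can tap directly.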