In 1981, North Carolina welcomed the semiconductor industry as a promise of jobs and modernization, investing millions in what leaders called a new industrial revolution. Yet behind the optimism, advocates were already asking a crucial question: what hidden costs would come with high-tech growth?

That question remains unanswered more than four decades later. From California’s Silicon Valley to North Carolina’s Research Triangle, the semiconductor industry has left a documented trail of chemical exposure, groundwater contamination, and worker health risks. Early warnings from groups like the Silicon Valley Toxics Coalition showed that what happens inside chip fabrication plants inevitably affects surrounding communities. But these lessons were largely ignored as states rushed to attract investment and outcompete one another.

Weak regulation, fragmented health and environmental data, and industry secrecy allowed harms to accumulate quietly. While official injury statistics appeared low, physician reports told a different story: elevated rates of chemical burns, reproductive hazards, and long-term illness. Preventive measures such as chemical disclosure, worker health monitoring, and continuous environmental oversight were known and feasible, yet rarely enforced.

Today, the same pattern is repeating on a much larger scale. The global race for AI chips and hyperscale data centers is driving massive construction projects that consume enormous amounts of water and energy, often in already stressed regions. Policies like the U.S. CHIPS and Science Act prioritize speed and competitiveness, once again treating environmental safeguards as obstacles rather than necessities.

The article, written by Dee McCrorey and Chip Hughes, examines how the semiconductor industry has repeatedly expanded without adequate safeguards, resulting in long-term environmental damage and worker health risks.