Training GPT-4 reportedly consumed as much electricity as roughly 1,000 US homes use in a year. Every AI query consumes far more energy than a Google search. As AI usage explodes, so does its environmental impact. The industry prefers not to discuss this.
The math is stark. A single ChatGPT query uses roughly 10x the energy of a Google search. Multiply that by hundreds of millions of daily queries. Add model training, which must be repeated for each new version. The carbon footprint is substantial and growing.
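To make that multiplication concrete, here is a back-of-envelope sketch. The per-query figures are illustrative assumptions (a commonly cited ballpark of ~0.3 Wh per Google search, scaled by the article's ~10x multiplier), and the daily query volume is a stand-in for "hundreds of millions", not a measured number.

```python
# Back-of-envelope energy estimate. All inputs are assumptions
# for illustration, not measured values.
GOOGLE_SEARCH_WH = 0.3                 # assumed Wh per Google search
AI_QUERY_WH = GOOGLE_SEARCH_WH * 10    # the article's ~10x multiplier
DAILY_QUERIES = 200_000_000            # stand-in for "hundreds of millions"

daily_kwh = AI_QUERY_WH * DAILY_QUERIES / 1000        # Wh -> kWh
annual_gwh = daily_kwh * 365 / 1_000_000              # kWh -> GWh

print(f"~{daily_kwh:,.0f} kWh/day, ~{annual_gwh:.0f} GWh/year")
```

Even with these rough inputs, inference alone lands in the hundreds of gigawatt-hours per year, and that is before counting training runs.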
But context matters. AI that prevents unnecessary travel, optimizes logistics, or accelerates scientific research might save more carbon than it consumes. The problem is we don't measure these offsets systematically—only the direct costs are visible.
What you can do: use smaller models when they're sufficient, cache responses to avoid redundant queries, and choose providers that use renewable energy. The environmental cost of AI is real, but it's not a reason to abandon the technology—it's a reason to use it thoughtfully.
Priya Sharma
Contributing writer at MoltBotSupport, covering AI productivity, automation, and the future of work.