
    Why don’t AI companies talk about energy usage?

By Techurz | May 28, 2025
(Image: Sam Altman and ChatGPT)


If you’ve used ChatGPT recently (and statistically, you probably have), you’re part of a global trend. OpenAI’s chatbot is estimated to be the fifth most visited website in the world, with more than 400 million users a week.

    And that’s just one AI tool. As generative AI becomes embedded into apps, search engines, workplaces and daily habits, our interactions with large language models (LLMs) like ChatGPT, Google Gemini and Claude are only increasing.

    We’ve become more aware of AI’s risks, from misinformation and deepfakes to surveillance and emotional dependence. But one of the biggest is AI’s environmental impact.



    Running LLMs requires enormous amounts of electricity and water. These models consume energy not just during training, when they absorb and organize vast volumes of data, but every time you ask a question. That’s billions of queries a day, each one demanding computational power and adding to a growing environmental cost.
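To make that scale concrete, here is a rough illustration of how per-query energy adds up over a day. Both the query volume and the per-query energy figure below are assumed placeholders for the sake of the arithmetic, not reported numbers.

```python
# Rough scaling illustration: per-query energy multiplied by daily query volume.
# Both inputs are assumed placeholder figures chosen for illustration only.
QUERIES_PER_DAY = 1e9        # assumption: one billion queries per day
JOULES_PER_QUERY = 3_000     # assumption: a few kilojoules per typical reply

daily_joules = QUERIES_PER_DAY * JOULES_PER_QUERY
daily_mwh = daily_joules / 3.6e9  # 1 MWh = 3.6 billion joules

print(f"~{daily_mwh:,.0f} MWh per day under these assumptions")
# ≈ 833 MWh/day, roughly the daily electricity use of tens of thousands of US homes
```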

    Why we still don’t know how much energy AI really uses

The truth is, we don’t know how much energy AI really uses, and that’s a big problem.

    Unlike most industries, AI companies aren’t required to report the environmental footprint of their models. There’s no standardized regulation or reporting framework in place for the energy use or carbon emissions tied specifically to AI systems.

    There are a few reasons for that. First, the technology is still relatively new, so the infrastructure for this sort of regulation and reporting hasn’t caught up.


But tech companies also haven’t pushed for it. That’s partly because AI is a fiercely competitive space, which means sharing energy data could inadvertently reveal details about a model’s size, architecture or efficiency.

    It’s also technically difficult. AI systems are spread across vast server farms, multiple teams and shared infrastructure, which makes it hard to isolate and track usage.

    Then there’s the optics. Companies heavily invested in the narrative that AI will only do us all vast amounts of good don’t want to be linked to sky-high emissions or the guzzling of finite resources.

    So, with little transparency, researchers and journalists are left to estimate. And those estimates are alarming.

    Here’s what we do know about AI’s energy use

    Many credible estimates have been made over the past few years. But a recent report from MIT Technology Review offers one of the clearest pictures yet of AI’s growing appetite for electricity and water.

The report is filled with striking comparisons. For example, generating a five-second AI video might use as much energy as running a microwave for an hour.

    Even simple chatbot replies can vary widely in energy consumption. One estimate puts a basic reply at anywhere between 114 and 6,700 joules, which is equivalent to running a microwave for between half a second and eight seconds. But as tasks become more complex – like those that involve images or video – the energy cost rises dramatically.
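For readers who want to sanity-check that comparison, here is a minimal sketch of the conversion. The microwave wattage (roughly 800 W) is an assumption for illustration, not a figure from the report.

```python
# Back-of-envelope conversion: chatbot-reply energy expressed as microwave running time.
# Assumption: the microwave draws roughly 800 W (800 joules per second).
MICROWAVE_WATTS = 800

def microwave_seconds(energy_joules: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave of the given wattage would run on `energy_joules`."""
    return energy_joules / watts

for reply_joules in (114, 6_700):
    print(f"{reply_joules} J ≈ {microwave_seconds(reply_joules):.2f} s of microwave time")
# 114 J   ≈ 0.14 s (the quoted "half a second" implies a lower-wattage oven or rounding)
# 6,700 J ≈ 8.38 s (close to the ~8 seconds quoted above)
```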

    According to the report, the bigger picture is even more concerning. In 2024, US data centers consumed around 200 terawatt-hours of electricity, which is roughly the same as Thailand’s entire annual consumption.

    And that number is climbing fast. By 2028, researchers estimate that AI-related electricity use alone could reach up to 326 terawatt-hours per year. That’s more than all of the current data center usage in the US and enough to power more than 22% of American households annually.

In carbon terms, that’s roughly equivalent to driving more than 300 billion miles, or about 1,600 round trips to the sun.
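Those comparisons can be roughly sanity-checked with a few public averages. The sketch below uses assumed round figures for household consumption, grid carbon intensity and per-mile car emissions, and it lands in the same ballpark as the numbers above.

```python
# Back-of-envelope check on the 326 TWh/year projection.
# Every conversion factor below is an assumed round figure, not a value from the report.
AI_TWH_2028 = 326                # projected AI-related electricity use, TWh per year
KWH_PER_TWH = 1e9
US_HOUSEHOLDS = 131e6            # assumption: roughly 131 million US households
KWH_PER_HOUSEHOLD_YEAR = 10_800  # assumption: average annual household consumption
KG_CO2_PER_KWH = 0.4             # assumption: average grid carbon intensity
KG_CO2_PER_MILE = 0.4            # assumption: emissions of an average gasoline car
EARTH_SUN_MILES = 93e6           # one-way distance from Earth to the sun

ai_kwh = AI_TWH_2028 * KWH_PER_TWH
households = ai_kwh / KWH_PER_HOUSEHOLD_YEAR
print(f"Powers ~{households / 1e6:.0f} million households "
      f"(~{households / US_HOUSEHOLDS:.0%} of US households)")

co2_kg = ai_kwh * KG_CO2_PER_KWH
miles = co2_kg / KG_CO2_PER_MILE
print(f"~{miles / 1e9:.0f} billion miles of driving, "
      f"or ~{miles / (2 * EARTH_SUN_MILES):,.0f} round trips to the sun")
```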

    It’s not just about power either. AI infrastructure also consumes vast amounts of water, primarily for cooling. In some regions, this adds strain to already stretched water supplies, which is a serious concern during heatwaves and droughts.

    Experts say that one of the biggest challenges here is scale. Even if we had precise figures today, we’d still be underestimating the problem in a year or even in a month’s time.

    That’s because the way we use AI is evolving rapidly. Generative models are being built into everyday tools, from writing apps and customer service bots to photo-editing software and search engines. As this adoption accelerates, without a clear understanding of the costs, the environmental impact is likely to spiral much, much faster than we ever expected.

    (Image credit: Shutterstock/Sashkin)

    Here’s what needs to change and who’s stepping up

    The good news is there is growing momentum to make AI more accountable for its environmental footprint. But right now, transparency is the exception and not the rule.

For example, the Green Software Foundation (GSF), a global non-profit whose members include Microsoft, Cisco, Siemens, Google and other companies, is one of the groups leading the charge.

    Through its Green AI Committee, the GSF is developing sustainability standards that are designed specifically for AI. This includes lifecycle carbon accounting, open-source tools for tracking energy usage and real-time carbon intensity metrics, which are all aimed at making AI’s environmental impact measurable, reportable and (hopefully) manageable.
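To illustrate what per-workload energy and carbon tracking can look like in practice, here is a minimal sketch using the open-source codecarbon Python package. It is one example of this class of tooling rather than a GSF standard, and the workload function shown is a hypothetical placeholder.

```python
# Minimal sketch of per-workload energy/carbon tracking with the open-source
# codecarbon package (pip install codecarbon). This is an example of this class
# of tooling, not a GSF standard; `run_workload` is a hypothetical placeholder.
from codecarbon import EmissionsTracker

def run_workload():
    # Stand-in for a real training or inference job.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="ai-energy-demo")
tracker.start()
try:
    run_workload()
finally:
    emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent for the run

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```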

Policy frameworks are also taking shape in some regions. For example, the EU’s AI Act encourages sustainability through risk assessments, while the UK’s AI Opportunities Action Plan and the British Standards Institution (BSI) are creating technical guidance on how to measure and report AI’s carbon footprint. These are early steps, but they could help to inform future regulation.

Some AI companies are taking steps in the right direction too, investing in renewable energy, researching more efficient training methods and developing improved cooling infrastructure. But these improvements aren’t yet standard across the industry, and there’s still no broadly accepted approach.

    That’s why transparency matters. Without clear and open data about how much energy these systems consume, we can’t accurately assess the cost of AI or hold the right companies accountable. We certainly can’t build more sustainable policy or infrastructure around it either. Tech companies can’t keep asking us to trust in the future of AI while hiding the true cost of running it.
