Aviation Analysis – Industry Travel News
    Tech

    Thinking requires a lot of energy – artificial intelligence consumes a lot of energy

    By Theodore Meeks | October 31, 2023 | 3 Mins Read

    The number of AI applications is growing rapidly – and with it, energy consumption. What solutions are there?

    What is this about? Artificial intelligence (AI) is meant to help when humans are too slow, when work is too tedious, or when a machine is simply more precise. But when using AI, we often forget that training and running AI systems requires enormous amounts of energy, because computing power always requires electricity. The more complex the calculations, the more power servers and computers need. And AI calculations are always complex – after all, the user expects a specific, detailed and perhaps even personal answer to a specific question.

    The smarter the answer, the more computing power is required and thus the more energy is consumed.

    Why so much electricity? Every user query, for example in ChatGPT, consumes a lot of energy – after all, each one triggers large-scale computing operations across dozens of servers. But before the system is ready to provide the smartest possible answers, it must be trained. “To train a language model, for example, it has to perform calculations on thousands of billions of words,” says Guido Berger, digital editor at SRF. And: “The smarter the answer, the more computing power is required and thus the more energy is consumed.” Power consumption is even higher for photo and video applications.

    It’s not entirely clear whether AI providers will gain anything at all.

    Who pays for this electricity? “AI providers are currently burning through their investment dollars in their own data centers,” Guido Berger points out. It remains open whether providers will one day be able to pass electricity costs on to users – for example by charging for each query. “It’s also unclear whether AI providers will gain anything at all,” Berger says. All these uncertainties make it difficult to predict the number of future queries, and thus the energy consumption of future AI. One thing does seem clear: the more expensive AI services become for users, the fewer requests there will be.

    Extremely high power consumption


    (Image: Keystone / Gaetan Bali)

    A Dutch scientist has found that running just the most popular AI app, ChatGPT, uses as much electricity as 40,000 households. And that is only one of many AI offerings on the market today, which are increasingly difficult to keep track of. As a result, nobody really knows the total amount of electricity consumed by AI. But experts are sure that consumption will rise quickly – all over the world. According to Ralf Herbrich of the Hasso Plattner Institute in Potsdam, all computers in the world, including data centers, currently consume about eight percent of the electricity produced globally. Herbrich told the German Press Agency (dpa): “There are estimates that consumption may rise to 30 percent in the next few years.”
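    To put the 40,000-household figure in perspective, a quick back-of-envelope calculation helps. Note that the average annual consumption per household used below is an assumed value (roughly typical for a European household), not a number from the article:

```python
# Back-of-envelope estimate of ChatGPT's annual electricity use, based on the
# article's figure of "as much electricity as 40,000 households".

HOUSEHOLDS = 40_000
KWH_PER_HOUSEHOLD_PER_YEAR = 2_900  # assumed average annual household use, in kWh

total_kwh = HOUSEHOLDS * KWH_PER_HOUSEHOLD_PER_YEAR
total_gwh = total_kwh / 1_000_000  # 1 GWh = 1,000,000 kWh

print(f"Estimated annual consumption: {total_gwh:.0f} GWh")  # 116 GWh
```

    Under that assumption, a single popular AI service already lands in the order of a hundred gigawatt-hours per year – which is why the totals across all AI offerings are so hard to pin down.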

    Can energy consumption be reduced? Because AI providers have to pay for computing power and electricity themselves, they have a strong interest in reducing both. Accordingly, a lot of research is being done on efficiency. One approach is to use smaller AI language models: these need less training data and less computing power, and therefore consume much less electricity. The catch: so far, smaller models are often not good enough for their specific applications. Berger also considers it unrealistic that electricity consumption will rise as dramatically as a result of artificial intelligence as some experts predict. His reason: “Nobody can pay for that.”
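    Why smaller models save so much electricity can be sketched with a common rule of thumb: generating one token of output costs roughly two floating-point operations per model parameter, so inference cost – and with it energy use – scales roughly linearly with model size. The parameter counts and answer length below are illustrative assumptions, not figures from the article:

```python
# Rough comparison of inference compute for a large vs. a small language model,
# using the rule of thumb: ~2 FLOPs per parameter per generated token.

def inference_flops(num_params: float, num_tokens: int) -> float:
    """Approximate floating-point operations to generate `num_tokens` tokens."""
    return 2 * num_params * num_tokens

ANSWER_TOKENS = 500  # assumed length of a typical answer

large = inference_flops(175e9, ANSWER_TOKENS)  # e.g. a 175-billion-parameter model
small = inference_flops(7e9, ANSWER_TOKENS)    # e.g. a 7-billion-parameter model

print(f"Large model: {large:.1e} FLOPs per answer")
print(f"Small model: {small:.1e} FLOPs per answer")
print(f"Ratio: {large / small:.0f}x")  # 25x
```

    In this sketch the 25x compute gap translates fairly directly into electricity, which is why the efficiency research mentioned above focuses so heavily on making small models good enough.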
