
The future of energy-efficient AI systems

16 December 2022 | Posted by Jente Somers | Artificial intelligence

Did you know that training a single large neural network can emit up to five times as much CO2 as an average car does over its entire lifetime? This illustrates a huge problem: AI, in general, consumes a lot of power. And since we collect more and more data every year, the problem will only keep growing.


More data means more (and often bigger) models, which in turn increases energy consumption. We're already aware of this growing problem, so we need to act now, before it is too late. How do we do this?

1. We can start by optimizing energy efficiency:

Connect the device to the cloud

In the cloud, hardware is shared, meaning power consumption is centralized in one location. This gives us the opportunity to use the hardware optimally and to optimize power consumption, yielding large gains. Additionally, the cooling required for these large data centers can be optimized, and the heat they produce can even be recycled!

However, this poses scaling problems when the fleet of connected devices grows. To circumvent this, we can move the intelligence to the edge, onto the devices themselves.

Reduce the energy consumption on the edge

For embedded or edge devices, the hardware isn't shared. In this case, you want to maximize the time spent in the 'idle task': whenever the system has nothing to do, put it into a low-power state using the hardware's low-power features. Other options are to reduce the complexity of the code or the number of computations, as sketched below.
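To make this duty-cycling idea tangible, here is a minimal Python sketch of an edge loop that only runs the (expensive) model when the input has changed enough and sleeps the rest of the time. The `read_sensor` and `run_inference` stubs and the thresholds are hypothetical placeholders; on a real embedded target the `time.sleep` call would map to the hardware's low-power sleep mode.

```python
import random
import time

SAMPLE_PERIOD_S = 1.0    # wake up once per second
CHANGE_THRESHOLD = 0.05  # only run the model when the input changed noticeably

def read_sensor() -> float:
    """Hypothetical sensor read; replace with your actual driver."""
    return random.random()

def run_inference(value: float) -> str:
    """Hypothetical (expensive) model call; replace with your actual model."""
    return "anomaly" if value > 0.9 else "normal"

def main() -> None:
    last_value = None
    for _ in range(10):  # would run forever on a real device; bounded for the demo
        value = read_sensor()
        # Skip the expensive computation when nothing meaningful changed.
        if last_value is None or abs(value - last_value) > CHANGE_THRESHOLD:
            print(run_inference(value))
            last_value = value
        # On a real MCU this sleep would be the hardware's low-power mode,
        # maximizing the time spent in the idle task.
        time.sleep(SAMPLE_PERIOD_S)

if __name__ == "__main__":
    main()
```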

2. Take into account the network quality:

Network-wise, not only the availability but also the quality of the network, especially its latency, should be considered with regard to power consumption. When the connection is poor, it takes the hardware longer to send the same amount of data, so it consumes more power. Research has even shown that this relation is exponential!

Latency issue

So how can you tackle this latency problem? One possibility is to combine the data into batches before sending it to the cloud. An even better approach is to wait until the network connection improves before sending those batches, which reduces the required energy. You can also use parallel computing, which is more efficient. But this caching of data brings its own problems: you have to wait longer to get an answer back, so the Quality of Service (QoS) may be lower. A rough sketch of such a batching strategy is given below.

Both approaches discussed above come with advantages as well as disadvantages. Each offers its own ways to help minimize energy usage, and each influences the actual energy consumption differently.
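As a rough illustration of the batching strategy mentioned above, here is a minimal Python sketch that caches readings locally and only uploads them once the batch is full and the measured latency is low enough. The `measure_latency_ms` and `send_to_cloud` functions are hypothetical stubs, and the batch size and latency threshold are arbitrary assumptions.

```python
import random
import time

BATCH_SIZE = 32          # readings to cache before attempting an upload
MAX_LATENCY_MS = 150.0   # only send when the network is 'good enough'

def measure_latency_ms() -> float:
    """Hypothetical latency probe (e.g. a lightweight ping); mocked here."""
    return random.uniform(20.0, 400.0)

def send_to_cloud(batch: list) -> None:
    """Hypothetical upload call; replace with your actual transport."""
    print(f"uploaded {len(batch)} readings")

def main() -> None:
    batch = []
    for _ in range(200):                       # simulate 200 sensor readings
        batch.append(random.random())          # new sensor reading
        if len(batch) >= BATCH_SIZE and measure_latency_ms() <= MAX_LATENCY_MS:
            send_to_cloud(batch)               # cheap to send now
            batch.clear()
        # Otherwise keep caching; the QoS trade-off is that results for
        # these readings come back later.
        time.sleep(0.01)

if __name__ == "__main__":
    main()
```

The trade-off from the text shows up directly in this sketch: the longer the device waits for a good connection, the cheaper the upload becomes, but the later the cloud can respond.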

3. If you can measure it, you can manage it!

So how do you decide whether to perform a given task on the edge or in the cloud? It might seem contradictory, but you can train an AI system with reinforcement learning to maximize the energy efficiency of each device individually throughout its lifetime. To optimize something, however, you first need good-quality data, and to get that data you need to measure whatever you want to optimize.

To make this concrete, let's again look at optimizing the energy efficiency of a connected device. After deciding where to perform a task (edge or cloud), you can use the measured power consumption to teach the model how good that decision was; a simple sketch of this feedback loop follows below. This, however, also costs energy and time, since you need to collect data and train the network. In the end, it's a balancing exercise that is worth the effort, taking into account not only the technological but also the business side!
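As a loose illustration of that feedback loop (and not Verhaert's actual method), here is a minimal Python sketch of an epsilon-greedy bandit, a very simple form of reinforcement learning, that picks edge or cloud for each task and updates its estimate of each option from a simulated energy measurement. The energy model and the exploration rate are purely illustrative assumptions.

```python
import random

ACTIONS = ["edge", "cloud"]
EPSILON = 0.1  # exploration rate: how often we try a random option

def measure_energy_mj(action: str) -> float:
    """Simulated per-task energy measurement; replace with real power readings."""
    if action == "edge":
        return random.gauss(40.0, 5.0)   # local compute: steady cost
    return random.gauss(25.0, 15.0)      # network + cloud: cheaper on average, more variable

def main() -> None:
    estimates = {a: 0.0 for a in ACTIONS}  # running average energy per action
    counts = {a: 0 for a in ACTIONS}

    for task in range(1000):
        # Explore occasionally, otherwise pick the cheapest-looking option.
        if task < len(ACTIONS) or random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = min(estimates, key=estimates.get)

        energy = measure_energy_mj(action)
        counts[action] += 1
        # Incremental update of the running average: the 'learning' step,
        # driven by the measured power consumption.
        estimates[action] += (energy - estimates[action]) / counts[action]

    print("estimated energy per task (mJ):", estimates)
    print("tasks routed:", counts)

if __name__ == "__main__":
    main()
```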


Key takeaways

  1. Energy efficiency can be optimized both in the cloud and on the edge.
  2. Network quality, especially latency, is also a key factor in energy consumption.
  3. With AI, the optimal load balancing can be determined for each connected device individually, so it makes the best use of its battery.

Any questions or want to know more about this topic? Watch the complete presentation during our InnoDays webinar or get in touch!

Tags: Artificial intelligence, Machine & deep learning
