Green Software Engineering, Programming Paradigms & Performance, Resource Efficiency Metrics, Sustainable Infrastructure & Frameworks

Comprehensive Evaluation of Software Energy Consumption: An Analysis of the Energy, Time, and Memory Triad Across Programming Languages

Gustavo Hammerschmidt · 18:52 06/Mar/2026 · 20 min


The transition toward sustainable computing has necessitated a fundamental shift in how the performance of software systems is quantified. Historically, the optimization of software was viewed almost exclusively through the lens of execution time, with memory usage serving as a secondary constraint. However, the emergence of global concerns regarding carbon footprints and the operational costs of massive data centers has elevated energy consumption to a primary specification alongside speed and memory footprint. The specific inquiry into the comparative energy efficiency of various programming languages was catalyzed by a landmark study that evaluated twenty-seven languages against three distinct specifications: energy consumption, execution time, and memory usage. This analysis examines the foundational research and the subsequent critical re-evaluations that have shaped the modern understanding of "green" software engineering.

The Foundations of Green Software Metrics

The pursuit of energy efficiency in software begins with a clear definition of the variables involved. The total energy consumed by a computer program is the integral of power over time, typically simplified to the product of average power draw and the total duration of the execution. This relationship is expressed as:

\[\text{Energy (J)} = \text{Power (W)} \times \text{Time (s)}\]

While this equation suggests that reducing execution time will directly reduce energy, the power variable is not a constant across different languages or implementation strategies. Power draw is influenced by hardware utilization, including CPU cycles, cache activity, and memory bus traffic. Therefore, a language that executes more slowly but at a significantly lower power level could, in theory, be more energy-efficient than a faster, more power-hungry alternative. This potential for a "speed-energy trade-off" provided the primary motivation for the empirical studies conducted by Rui Pereira and colleagues at the Green Software Laboratory.
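In practice, energy is estimated by sampling the power draw during a run and integrating numerically. The sketch below is a minimal illustration using hypothetical power samples taken at a fixed interval; real measurements would come from an interface such as Intel RAPL:

```python
def energy_joules(power_samples_w, interval_s):
    """Estimate energy by integrating sampled power over time with the
    trapezoidal rule: E = integral of P dt ~= sum((P_i + P_i+1)/2 * dt)."""
    total = 0.0
    for p0, p1 in zip(power_samples_w, power_samples_w[1:]):
        total += (p0 + p1) / 2.0 * interval_s
    return total

# Hypothetical samples, 1 s apart, averaging roughly 20 W over 4 s.
samples = [18.0, 22.0, 21.0, 19.0, 20.0]
print(energy_joules(samples, 1.0))  # → 81.0
```

Under constant power the integral collapses to the simple product above, which is why the article's equation is the usual shorthand.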

The initial research, presented at the 2017 ACM SIGPLAN International Conference on Software Language Engineering (SLE), sought to provide software engineers with data-driven support for selecting languages when energy is a concern. By using ten diverse programming problems from the Computer Language Benchmark Game (CLBG), the researchers monitored the performance of twenty-seven languages. These problems range from CPU-intensive simulations to memory-heavy tree manipulations, ensuring that the resulting rankings are not skewed by a single type of computational workload.

Comparative Analysis of the Three Specifications

The centerpiece of the 2017 study and its 2021 update is the comparative ranking of languages across energy, time, and memory usage. The findings demonstrate that while there is a strong correlation between speed and energy efficiency, the relationship is not absolute, especially when moving across different programming paradigms and execution types.

Global Efficiency Rankings

The languages were categorized into three execution types: compiled, virtual machine (VM), and interpreted. The data consistently shows that compiled languages are the most efficient across all three specifications, while interpreted languages occupy the bottom tier. The following table summarizes the overall results for a cross-section of the languages studied, normalized against the performance of C.

Language      Execution Type     Energy (norm.)   Time (norm.)   Memory (norm.)
C             Compiled           1.00             1.00           1.00
Rust          Compiled           1.03             1.04           1.03
C++           Compiled           1.34             1.56           1.03
Ada           Compiled           1.70             1.85           1.47
Java          Virtual Machine    1.98             1.89           6.01
Pascal        Compiled           2.14             3.02           1.05
C#            Virtual Machine    3.14             3.14           5.02
Go            Compiled           3.23             2.83           1.37
Dart          Interpreted        3.83             4.62           5.83
JavaScript    Interpreted        4.45             6.52           4.59
PHP           Interpreted        29.30            27.64          5.06
Lua           Interpreted        45.98            82.91          1.06
Ruby          Interpreted        69.91            59.34          3.97
Python        Interpreted        75.88            71.90          2.80

The results indicate that the top five languages—C, Rust, C++, Ada, and Java—maintain their relative positions whether sorted by energy or by execution time. This suggests that for high-performance applications, speed remains the most reliable proxy for energy efficiency. However, the memory specification deviates significantly from this pattern. For instance, Java ranks 5th in energy and 4th in time, but drops to a much lower position in memory usage due to the overhead of the Java Virtual Machine (JVM).

The Dynamics of Energy and Execution Time

The high correlation between energy and time is particularly evident in compiled languages. Because these languages translate directly to machine code, they allow for aggressive compiler optimizations that minimize both the number of instructions executed and the time the CPU remains in a high-power state. The energy consumed by these languages is largely a function of how long they occupy the hardware.

Nevertheless, there are benchmarks where this correlation breaks down. In the regex-redux benchmark, interpreted languages such as JavaScript and PHP performed surprisingly well in energy efficiency despite their overall low rankings. This is often because these languages delegate string processing to highly optimized C libraries, effectively offloading the heavy computational work to a more efficient execution environment. Furthermore, the study noted that a faster language is not always a greener one: in the fasta benchmark, Fortran ranked 2nd in energy efficiency but only 8th in execution time. This implies that Fortran utilized the CPU and memory bus in a way that resulted in lower power draw, compensating for its longer execution duration.

Memory Usage and DRAM Energy Consumption

The third specification, memory, was initially evaluated using peak memory usage (Resident Set Size). The researchers found that peak memory usage showed almost no significant correlation with memory energy consumption. A language could reserve a large block of memory at the start of execution and hold it throughout, resulting in a high peak usage but not necessarily high energy consumption if the memory is not actively being read from or written to.
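The weakness of peak RSS as a proxy is easy to demonstrate. The sketch below reads the process's peak resident set size via getrusage (Linux assumed, where ru_maxrss is reported in kilobytes) and shows that the peak stays high even after a large allocation is released, so it says nothing about ongoing memory activity:

```python
import resource

def peak_rss_kb():
    """Peak resident set size of this process (kilobytes on Linux)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss_kb()
buf = b"x" * (50 * 1024 * 1024)  # allocate and touch ~50 MB
during = peak_rss_kb()
del buf                          # release it again
after = peak_rss_kb()

# The peak never decreases: `after` equals `during`, even though the
# memory is no longer in use -- peak RSS cannot capture activity over time.
print(before, during, after)
```

A program that held this buffer idle for an hour and one that rewrote it continuously would report the same peak RSS while driving very different DRAM energy.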

In the 2021 update, the focus shifted to continuous memory usage over time. This revealed a much stronger relationship: the total energy consumed by the DRAM is directly proportional to the total memory activity throughout the program's lifecycle. The findings suggest that it is more energy-efficient to allocate a large block of memory at once and release it quickly than to maintain continuous, fluctuating memory usage throughout the execution.

Execution Type     Avg Energy (J)   Avg DRAM Energy Share
Compiled           120              11.06%
Virtual Machine    576              13.50%
Interpreted        2365             15.20%

The table above illustrates that as we move from compiled to interpreted languages, not only does the total energy increase, but the proportion of energy assigned to the DRAM also grows. This is largely due to the increased memory traffic required for interpreting code or managing virtual machine states, which keeps the memory controller and DRAM modules active for longer periods.

Programming Paradigms and Their Environmental Impact

The research also analyzed the impact of programming paradigms—imperative, functional, object-oriented, and scripting—on energy consumption. This analysis provides a more nuanced understanding of how language design philosophy influences hardware utilization.

Imperative and Object-Oriented Languages

Imperative languages like C and Pascal and object-oriented languages like C++ and Java generally demonstrate superior energy efficiency. This is attributed to their ability to provide developers with fine-grained control over system resources and memory layout. Java, in particular, stands out as the most efficient language that does not compile to native code. Despite its high memory footprint, the JVM's Just-In-Time (JIT) compilation and mature garbage collection algorithms allow it to achieve a balance of performance and energy usage that rivals some compiled languages.

Functional Languages

Functional languages like Haskell, OCaml, and Erlang present a more varied energy profile. While Haskell and OCaml can be quite fast, their reliance on immutability and recursion can lead to higher memory traffic and more frequent garbage collection cycles, which increases DRAM energy consumption. Interestingly, the studies found that in some functional languages, "faster is not always greener". For certain benchmarks, a more idiomatic but slower functional approach might consume less energy than an optimized imperative-style approach implemented within the same language, due to differences in how the compiler handles higher-order functions and lazy evaluation.

Scripting and Interpreted Languages

Scripting languages, which are almost entirely interpreted, are the least energy-efficient category. Python, the most popular language in this category, was found to consume up to 75 times more energy than C in the original 2017 study. This massive discrepancy is the result of the interpreter overhead; every line of Python code must be parsed and executed by the CPython interpreter, which involves significantly more CPU cycles than executing a pre-compiled binary. However, the use of alternative interpreters like PyPy can improve Python's energy efficiency by more than 50%, reducing its overhead to approximately 21 times that of C.
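The interpreter overhead is easy to observe directly. The sketch below, a minimal illustration, times a hand-written Python loop against the built-in sum, which is implemented in C inside CPython; delegating the loop to compiled code typically wins by a wide margin, the same effect that makes NumPy-style offloading effective:

```python
import timeit

def python_sum(values):
    """Sum values with an explicit interpreter-level loop."""
    total = 0
    for v in values:
        total += v
    return total

values = list(range(100_000))

loop_time = timeit.timeit(lambda: python_sum(values), number=50)
builtin_time = timeit.timeit(lambda: sum(values), number=50)

# The built-in runs its loop in C, so each element costs far fewer
# CPU cycles -- fewer cycles at similar power means less energy.
print(f"interpreted loop: {loop_time:.3f}s, built-in sum: {builtin_time:.3f}s")
```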

Server-Side Efficiency: Frameworks and Protocols

Server programs bring energy consumption into the context of networked applications and data centers. In these environments, the choice of a programming language is often coupled with the selection of a communication framework and an application architecture.

Multi-Language Offloading and Server Processes

A 2025 study focused on the performance and energy efficiency of multi-language offloading scenarios, where tasks are sent from a client to a server process developed in a different language. The study evaluated Go, C++, and Java as server-side languages and compared different Remote Procedure Call (RPC) frameworks.

The findings suggested that Go is the optimal language for building server processes among the three tested. Go's lightweight concurrency model (goroutines) and efficient garbage collector make it particularly well-suited for handling the high-throughput, low-latency requirements of modern server programs.

RPC Frameworks: Apache Thrift vs. gRPC

The study also provided critical insights into the energy cost of communication protocols. While gRPC is a popular choice for microservices due to its feature richness and support for bidirectional streaming, it was found to be less energy-efficient than Apache Thrift in many scenarios.

Framework            Speed                          Energy
Apache Thrift        Fastest in 83% of scenarios    Most energy-efficient in 66% of cases
gRPC (FlatBuffers)   Intermediate                   Not explicitly ranked at the top
gRPC (Protobuf)      Slowest in 81% of scenarios    Most energy-consuming in 78% of cases

The superior performance of Apache Thrift is attributed to its simpler protocol structure and more efficient binary serialization. In contrast, gRPC uses HTTP/2 at the transport layer, which introduces framing overhead and complex metadata exchange (headers and trailers). While these features are essential for modern cloud-native capabilities like flow control and load balancing, they come at a measurable energy cost.
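The serialization effect can be illustrated in isolation. The sketch below is a simplified stand-in, not the frameworks' actual wire formats: it packs the same hypothetical record as a fixed-layout binary struct and as JSON text, and the binary form is a fraction of the size, which means less CPU and network work per message:

```python
import json
import struct

# A hypothetical RPC payload: (user_id, latitude, longitude, flags).
record = (123456, 40.7128, -74.0060, 7)

# Fixed-layout binary encoding: little-endian uint32, two float64s, uint16.
binary = struct.pack("<IddH", *record)  # 22 bytes, no field names on the wire

# The same record as JSON text, roughly what a verbose text format carries.
text = json.dumps({"user_id": record[0], "latitude": record[1],
                   "longitude": record[2], "flags": record[3]}).encode()

print(len(binary), len(text))  # the binary encoding is several times smaller
```

Thrift's compact binary protocols sit close to the struct end of this spectrum, while gRPC adds HTTP/2 framing and header metadata on top of its Protobuf payloads.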

Web Application Frameworks

In the context of web applications, research comparing frameworks like Django (Python), Express (JavaScript), Laravel (PHP), and Spring Boot (Java) has identified clear winners in both response time and energy efficiency.

  • Express and Spring Boot: These frameworks emerged as the most efficient, providing the lowest energy consumption per request.
  • Django and Laravel: These frameworks exhibited significantly higher energy usage, directly reflecting the inefficiencies of the interpreted languages they are built upon.
  • Concurrent Users: As the number of concurrent users increases, the efficiency gap between these frameworks widens, with Spring Boot and Express maintaining more stable energy profiles under heavy load.

Modern Java frameworks like Quarkus have demonstrated even greater potential for server-side efficiency. By utilizing native compilation via GraalVM, Quarkus can reduce memory usage by up to 80% and CPU usage by 95% compared to traditional Spring Boot applications, significantly improving its energy profile for containerized deployments.

Energy Consumption in Artificial Intelligence

The rapid growth of AI and machine learning (ML) has made the energy efficiency of these workloads a major environmental concern. A 2025 study evaluated five programming languages and seven AI algorithms during both training and inference phases.

Language Impact on ML Sustainability

The study found that the choice of programming language can have up to a 54x impact on energy consumption during AI tasks. Compiled languages like C++ consistently outperformed interpreted languages like Python and MATLAB, which is significant given the ubiquity of Python in the AI field.

AI Algorithm           C++ (10^7 J)   Java (10^7 J)   Python (10^7 J)   R (10^7 J)
KNN                    2.67           0.34            1.61              0.65
SVC                    4.87           0.38            366.85            163.10
Random Forest          13.54          80.06           110.43            209.05
Logistic Regression    1.91           38.31           14.82             996.18
All Algorithms (Sum)   37.08          152.54          578.78            1401.23

The table reveals that the most efficient language depends heavily on the specific algorithm being used. For instance, Java was remarkably efficient for training KNN and SVC models in this particular study, while C++ was the overall leader across all seven algorithms. The research emphasizes that the quality of an algorithm's implementation may be the decisive factor in Green AI, regardless of the language used.

Training vs. Inference

The energy profile of a language also differs between the training and inference phases. While training is often a one-time (or periodic) high-energy event, inference is performed continuously for the lifetime of a model. The study found that choosing the right language for inference can reduce energy costs by over 99% compared to less efficient choices, highlighting the importance of deployment-time language selection.
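A back-of-the-envelope calculation shows why the inference phase dominates. The per-call costs and request volume below are illustrative assumptions, not figures from the study; the point is only that a fixed per-call ratio compounds over the model's lifetime:

```python
def lifetime_inference_energy_j(energy_per_call_j, calls_per_day, days):
    """Total inference energy over a deployment lifetime: per-call cost
    multiplied by call volume and duration."""
    return energy_per_call_j * calls_per_day * days

# Assumed per-call costs for two hypothetical deployment choices.
efficient = lifetime_inference_energy_j(0.005, 1_000_000, 365)    # e.g. compiled runtime
inefficient = lifetime_inference_energy_j(0.5, 1_000_000, 365)    # e.g. interpreted runtime

reduction = 1 - efficient / inefficient
print(f"lifetime inference energy cut by {reduction:.0%}")  # → 99%
```

Because the call volume cancels out, the lifetime saving equals the per-call ratio, so even modest per-inference differences translate into large absolute totals at scale.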

User Interface and Front-End Energy Efficiency

Energy efficiency is not restricted to the server; the front-end design and the choice of automation testing frameworks also contribute to the total power draw of a digital system.

Front-End Design and OLED Technology

Research indicates that UI design choices, such as color schemes and animations, have a measurable impact on the battery life of mobile devices. For devices with OLED displays, which illuminate each pixel individually, dark mode can reduce energy consumption by up to 72%. On LCD screens, however, the energy impact of dark mode is negligible because the backlight remains on regardless of the pixel color.

UI Automation Testing Frameworks

The choice of frameworks used for automated browser testing also introduces energy variability. A study comparing Nightwatch, Playwright, Puppeteer, and Selenium found that energy costs vary significantly by both the framework and the UI action being performed.

  • Puppeteer: Found to be the most efficient for left-click, right-click, double-click, checkbox, and input-text actions.
  • Selenium: Most efficient for browser refresh and scroll actions.
  • Nightwatch: Consistently exhibited the highest energy usage across most actions.
  • Most Intensive Actions: Drag-and-drop and input-text emerged as the most energy-intensive UI actions.

The energy cost of performing the same action could vary by up to a factor of six depending on the framework, which suggests that for large-scale testing environments, the selection of an automation tool has significant environmental and operational implications.

Critical Re-evaluation: It’s Not Easy Being Green

While the rankings provided by Pereira et al. have been widely influential, they have also faced rigorous academic critique. The paper "It’s Not Easy Being Green" identifies several methodological flaws and provides a causal model to explain the energy differences observed in prior work.

Causal Misinterpretation and Implementation Bias

The primary critique is that many researchers have interpreted statistical associations as causal relationships—the idea that the language itself directly causes the energy consumption. However, the critique argues that once confounding factors are controlled, energy is almost always directly proportional to execution time.

The "energy anomalies" observed in previous rankings—such as a faster language appearing less efficient—are often the result of differences in parallelism and CPU utilization. A program that uses all 16 cores of a CPU will draw significantly more power per second than a single-threaded program, even if it finishes much faster. When the researchers controlled for these factors by forcing benchmarks to run on a single core and normalizing for time, the notable discrepancies between the languages "vanished".
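The confound is visible in the energy equation itself. The wattages below are illustrative assumptions, not measurements: a single core drawing 20 W versus sixteen cores drawing 120 W in aggregate:

```python
def energy_j(power_w, time_s):
    """E = P * t, the same relation used throughout the article."""
    return power_w * time_s

# Illustrative (assumed) figures, not measured values.
single_core = energy_j(power_w=20.0, time_s=16.0)   # 1 core, 16 s   -> 320 J
multi_core = energy_j(power_w=120.0, time_s=1.5)    # 16 cores, 1.5 s -> 180 J

# The parallel run draws 6x the power yet uses less total energy because
# it finishes sooner; comparing languages without controlling for core
# count therefore conflates parallelism with language efficiency.
print(single_core, multi_core)  # → 320.0 180.0
```

Flip the parallel duration to 3 s and the ranking inverts, which is exactly why uncontrolled benchmarks can make a faster implementation look less green.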

The TypeScript-JavaScript and C-C++ Discrepancies

The critique highlights specific counter-intuitive data points in the original study. For example, Pereira et al. reported that C++ was 34% less energy-efficient and 56% slower than C, despite sharing the same compiler backend and optimizations. Similarly, TypeScript was reported as 4.8x less energy-efficient than JavaScript, even though TypeScript transpiles into JavaScript and runs on the same Node.js runtime. These differences are more likely due to variations in the specific algorithm implementations used in the Computer Language Benchmark Game than to the languages themselves.

Resident Set Size vs. Memory Activity

The critique also challenges the use of Resident Set Size (RSS) as a proxy for memory activity. It argues that RSS includes inactive memory and fails to account for cache activity and Last Level Cache (LLC) misses, which are the true drivers of memory-related power draw. This perspective aligns with the 2021 Pereira update, which shifted focus toward continuous memory activity as a more reliable metric.

Practical Guidance for Professional Software Engineering

The synthesis of this extensive body of research provides a clear framework for making energy-aware decisions in software development. For professional peers in the domain of green software engineering, several high-level takeaways are evident.

Selecting a Language for Efficiency

When energy efficiency is a primary specification, compiled languages remain the gold standard. C and Rust offer the highest efficiency, with Rust providing the added benefit of memory safety without the performance penalties of a garbage collector. For enterprise-level applications where a virtual machine is required, Java remains the most efficient choice, particularly when paired with modern frameworks like Quarkus or when running on highly optimized JVMs.

Optimization Priorities

The data suggests that the single most effective way to reduce a program's energy consumption is to minimize its execution time. Developers should focus on algorithmic efficiency and hardware utilization. However, special attention must be paid to memory activity. Reducing the total amount of data shuffled between the CPU and DRAM is essential for minimizing the DRAM energy portion, which can account for up to 31% of total energy consumption.
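One concrete way to cut allocator and memory traffic is to avoid materializing large intermediates. The sketch below uses tracemalloc to compare summing squares via a list comprehension, which allocates the entire intermediate list, against a generator expression, which streams one value at a time; the result is identical but the peak allocation differs dramatically:

```python
import tracemalloc

N = 200_000

def peak_bytes(fn):
    """Run fn and return the peak traced allocation during the call."""
    tracemalloc.start()
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

list_peak = peak_bytes(lambda: sum([x * x for x in range(N)]))
gen_peak = peak_bytes(lambda: sum(x * x for x in range(N)))

# The generator version never holds all N squares at once, so far less
# data moves through the allocator (and, at scale, through the DRAM bus).
print(list_peak, gen_peak)
```

tracemalloc measures heap allocations rather than DRAM energy directly, but the allocation pattern is a reasonable first-order proxy for the memory activity the 2021 study identified as the driver of DRAM consumption.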

Server and Cloud Strategy

For server-side processes, Go is an exceptionally efficient choice due to its balance of speed, concurrency, and energy usage. When designing inter-service communications, the use of Apache Thrift should be considered over gRPC if message-level energy efficiency is more critical than the features provided by the HTTP/2 transport. In AI and ML contexts, while Python is indispensable for development, the actual training and inference should be offloaded to optimized C++ or Java libraries whenever possible to avoid the massive energy overhead of the interpreter.

Conclusion: The Future of Energy-Aware Software Design

The investigation into energy efficiency across programming languages has evolved from a simple ranking of 27 languages to a complex discipline involving causal models, hardware performance counters, and framework evaluations. The original "3 specs" of energy, time, and memory provided a vital starting point that forced the software engineering community to confront the environmental impact of its choices.

While the direct correlation between speed and energy remains the most dominant trend, the nuances of memory activity, garbage collection, and communication protocols demonstrate that green coding is a multi-dimensional challenge. The critiques of the early rankings emphasize that software energy consumption is not just about the syntax or semantics of a language, but about the quality of the implementation, the number of active cores, and the efficiency of the underlying runtime.

As digital systems continue to consume an increasing share of the world's electricity, the ability to design and deploy energy-efficient software will become a defining skill for the next generation of engineers. By leveraging the data from language rankings, RPC comparisons, and framework studies, the industry can move toward a more sustainable future where performance and environmental responsibility are viewed as two sides of the same coin. The journey toward green software is not just about choosing a "green" language; it is about an enduring commitment to optimization, measurement, and the reduction of computational waste across every layer of the technology stack.
