What a software company can do to reduce its carbon footprint

As the world becomes more aware of the impact of climate change, many companies are taking steps to reduce their carbon footprint, and software companies are no exception.

Vieolo is primarily a software company and there is a good chance that you, the reader of this article, are also working in the software industry.

The advancement of software has been a double-edged sword for the environment. On one hand, the software industry has collectively brought unmatched efficiency to many tasks and prevented the usage of trillions of pieces of paper. On the other hand, we have actively worked toward increasing the data consumption of the average citizen, which has in turn created its own environmental problems in the form of e-waste and energy usage.

Regardless of the overall positive effect of our work, we need to recognize that our operations and products have a negative impact on the environment, and we should all commit ourselves to doing our part to reduce our carbon footprint, however small.

We understand that our carbon footprint might be a drop in the ocean compared to other companies and industries, but, nevertheless, we have tried to adopt the following strategies since the start of 2022.

Prevention rather than offset

While offsetting our energy consumption through carbon credits or other mechanisms can be an important part of our overall sustainability strategy, reducing our energy consumption should be the top priority.

There are fundamental problems with the current approach to carbon offsetting. The effectiveness of an offset depends on the assumption that the offset project would not have happened without the offsetting investment. However, this is difficult to prove, and in many cases, offset projects have turned out to be ones that would have been implemented even without offset funding.

Additionally, there are currently no widely accepted criteria for a successful carbon offsetting project. The effectiveness of a project mostly depends on the claims of the vendor, and most companies cannot independently verify those claims. As a result, choosing an offsetting project, at the time of writing, involves picking a project at random and relying on chance.

Moreover, many companies do not really care about the environmental impact and participate in a carbon offset project either to satisfy regulatory requirements or as a publicity stunt, even though such stunts are usually of no benefit to the companies’ public image.

Although genuine offsetting programs have their benefits, we should focus on lowering our energy consumption in the first place rather than trying to compensate for its damages afterward.

Of course, a good chunk of the energy we use goes into general operations similar to those of any other company, such as powering our offices and transportation. Here, we focus on the energy-saving strategies unique to software companies.

Programming languages and tools

Running a piece of code requires energy, and there is a good chance the energy your code uses is produced from fossil fuels. The energy our code uses is spent mostly on processing power (CPU, GPU, etc.) and RAM. By lowering CPU and memory usage, we can reasonably assume we are using less energy.
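We usually cannot measure energy directly without hardware support, but CPU time and memory use are reasonable proxies that the standard library can report. A minimal sketch in Python (the workload here is hypothetical; real energy measurement needs hardware counters):

```python
import time
import tracemalloc

def workload():
    # Hypothetical workload: sum the squares of the first million integers.
    return sum(i * i for i in range(1_000_000))

tracemalloc.start()
cpu_start = time.process_time()
result = workload()
cpu_used = time.process_time() - cpu_start
_, peak_bytes = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"CPU seconds: {cpu_used:.3f}")
print(f"Peak traced memory: {peak_bytes / 1024:.1f} KiB")
```

Running the same workload in different languages, or before and after an optimization, gives a rough but repeatable way to compare the resources (and hence energy) consumed.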

The amount of energy our code uses is highly dependent on the programming language we use. In general, interpreted languages (those that rely on a runtime environment) use more CPU and RAM than compiled languages, and languages with a garbage collector use more than those without one.

For example, a simple Python program uses a lot more CPU and RAM than a similar program written in C since the Python program has to spin up and use the Python interpreter while the C program does not have any such overhead.

Even though we can safely assume that languages like C or Rust are more energy-efficient than languages like Javascript and Python, quantifying this energy efficiency is quite difficult. The most reliable source of information is a single study performed by researchers at the University of Minho in Portugal. This study claims, for example, that Javascript consumes about 4 times and Python about 75 times more energy than C, while Rust is nearly on par with C.

Even though we can argue about the accuracy of this measurement, the overall theory has been proven right over and over again, both in small and large setups. For example, Discord rewrote one of its services, previously written in Go, using Rust. Even though Go is already a performant language, the Rust implementation managed to beat the Go implementation on every metric while lowering the overall memory and CPU usage.

Using programming languages and tools that do not rely on interpreters and runtimes is among the most reliable ways of reducing energy consumption that we have in the software industry.

On a small scale, the difference in energy consumption between the most and least efficient languages is negligible. The energy difference between a small Python program and a small Rust program running on your local machine is so small that you probably cannot even measure it.

However, regardless of the negligible difference on a small scale, developers rely on code already written by others, while the code they write may end up as the foundation (and technical debt) of future projects. For example, in 1995, Brendan Eich created Javascript for Netscape Navigator 2, allegedly within 10 days, to add some basic functionality to websites. As of 2023, Javascript has turned into the most used programming language. Even though the chances are very low, the project you are working on right now might end up running on millions of devices.

By using a programming language, we are actively participating in the development of its ecosystem and technical debt. By consciously choosing more efficient and performant languages that might be outside our comfort zone, we can lower the carbon footprint of our code down the road.

However, this does not mean that we should all abandon Python, Ruby, and Javascript and start writing code in Rust or C. Every one of these tools was created for a reason and has its own use cases.

Besides the energy used by the final application, we developers also consume a non-trivial amount of energy developing said application. In large-scale applications, the energy used during the development cycle is probably insignificant compared to the energy used to run the application. But in some cases, the energy you spend writing and debugging code in a more complex language such as Rust may be so high that you would be better off using Python in the first place.

Also, due to the existing foundations and code bases, you are forced to use certain languages in certain situations. If you are creating a website, you NEED to use Javascript since it is the primary language browsers run. Even if you use another programming language, your code will be transpiled into Javascript first. You can use languages like C or Rust on a website via WebAssembly, but unless you have a specific use case, you would be better off with Javascript.

Choose efficient hardware

While the software industry often has limited influence on the direction of hardware development, we have more power in our hands than we think.

Even though we are dependent on hardware vendors to run our code, the commercial success of a hardware product relies heavily on the available software it can run. Regardless of how good a piece of hardware is, it is utterly useless if there is no compatible quality software. So, when we decide to spend our time and energy writing software compatible with certain hardware, we are contributing to its financial success and the longevity of its design.

If we have the ability to choose (and deploy) the hardware necessary to run our code ourselves, we should make an effort to choose the most power-efficient products.

As an example, ARM processors are more power-efficient than their x86 counterparts. A study by two Brazilian researchers concluded that ARM-based SoCs, when used for HTTP and SQL servers, are 3 to 4 times more power-efficient than x86 systems. By choosing to write software for and support ARM-based SoCs rather than x86, we contribute to the financial success and further development of more efficient hardware.

Use AI when it's actually needed

The advancements in artificial intelligence are among the biggest achievements of modern engineering, solving unique problems that conventional architectures cannot.

To produce an AI model, we have to analyze, prepare, and process an enormous set of data. The model then uses what it learned from that dataset to complete a certain task when presented with a new piece of data it has never seen before. This process of building the model is known as training.
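As a toy illustration of what training means, consider fitting a hypothetical one-parameter model by gradient descent. Real models repeat this kind of update across billions of parameters and examples, which is where the energy cost comes from:

```python
# Toy "training": fit y = w * x to data by gradient descent on squared error.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # true relationship: y = 2x

w = 0.0
learning_rate = 0.05
for _ in range(200):  # each pass re-processes the whole dataset
    gradient = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * gradient

print(round(w, 3))  # converges toward 2.0
```

Even this trivial example re-reads the entire dataset on every pass; scale the dataset and parameter count up by many orders of magnitude and the energy bill follows.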

Training an AI model is a very energy-intensive process. According to research by the University of Massachusetts Amherst, training a single AI model can emit as much carbon as five cars do over their lifetimes, manufacturing included.

So, before thinking about training a new AI model, we need to ask ourselves two questions. Only if the answer to both of these questions is no should we start the process of training a custom AI model.

Can the problem at hand be solved effectively by conventional software? If yes, we should stick to conventional software. A conventional solution is almost always more energy-efficient than an AI model, and even if a trained AI model were more efficient at run time, it might never be used enough to recoup the energy spent on training it.

Can the problem be solved effectively by a pre-trained AI model? If yes, we should use a pre-trained model. Over the years, many developers have trained various AI models that solve common problems to an acceptable level. Using a pre-trained model saves the enormous amount of energy required to train a model from scratch.
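To illustrate the first question: many tasks that sound like they need a classifier can be handled by plain rules. A hypothetical sketch (not a real moderation system):

```python
# Hypothetical moderation task: flag messages containing known spam phrases.
# A rule-based check like this needs no training, uses negligible CPU,
# and is easy to audit -- a conventional solution in the sense above.
BANNED_PHRASES = ("free money", "click here", "wire transfer")

def is_suspicious(message: str) -> bool:
    lowered = message.lower()
    return any(phrase in lowered for phrase in BANNED_PHRASES)

print(is_suspicious("Click HERE to claim your free money"))  # → True
print(is_suspicious("Meeting moved to 3pm"))                 # → False
```

Only when such rules demonstrably fall short does it make sense to move to a pre-trained model, and only after that to training one from scratch.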

The energy consumption of the training process is one of the important challenges that the tech industry, including us, has to solve in the upcoming years.

Have a sensible product development process

We developers are as influenced by the hype of new technology as any other group of people. At the same time, we all write software that we hope will be used by a large number of people, whether for economic or personal reasons.

Over the years, the software industry has tried to include or integrate trending technologies in the products it develops. At some point, everything had an element of blockchain or tried to be decentralized. At another point, everything was being written in Javascript. Currently, everything has generative AI or is “AI-powered,” even though the addition of AI might be entirely pointless. In many cases, the addition of these trendy features does not translate to the success or longevity of the project. In fact, the effect might be the opposite.

When a project fails or is delayed, all the energy spent and pollution created during its development has been for nothing, and the problems it was trying to solve remain unsolved. By having a more sensible product development process, rather than jamming in trendy features to create a short-term buzz, we increase the viability of our products and, in the long term, contribute to a reduction of our carbon footprint.

Your interests are aligned with the environment

Besides the obvious benefits of reducing our carbon footprint, it is actually in our own interest to do so. Most of the things we talked about above also save us money and/or improve our products.

Reducing the memory and CPU usage of our applications allows us to run our code on cheaper hardware or cloud infrastructure. Having a more power-efficient system reduces our energy bills. Focusing on the core features of our products, rather than trying to include unnecessary technologies, lowers development costs and helps their long-term success.

Implementing some of these ideas is easier said than done, but we in the software industry need to do our part to prevent and mitigate our negative influence on the environment and climate.