The AI boom shows no sign of abating: but at what cost to sustainability?
01 Nov 2023
7 mins read
"Remember, remember the 5th of November"
Having been brought up in the United Kingdom, I find this is the time of year when I start to recall this line from the famous rhyme commemorating Guy Fawkes' failed Gunpowder Plot in London more than four centuries ago. Let me spin that around for a second and suggest that we "Remember, remember the 30th of November". You'd be forgiven for raising a questioning eyebrow, but bear with me for just another paragraph or so...
It was on the 30th of November 1487 that the first German Beer Purity Law (Reinheitsgebot) was promulgated in Munich by Albert IV, Duke of Bavaria, decreeing that beer should be brewed from only three ingredients: water, malt, and hops.
The first international association football game also took place on this date in 1872 in Glasgow, ending in a 0-0 draw between Scotland and England.
In 1928, Australian cricket legend Sir Donald Bradman made an inauspicious Test debut, scoring 18 and 1 against England in the 1st Ashes Test in Brisbane and being dropped to 12th man for the following Test match. Bradman went on to average 99.94 in a career spanning 80 Test match innings, a feat that is unlikely ever to be matched.
Now, despite my close affinity to beer, football, and cricket, you may understandably ask what any of this has to do with the topics at hand: AI and sustainability.
Cast your mind back to the 30th of November 2022, when OpenAI released ChatGPT to the world. Guy Fawkes himself would have been proud of the global explosion we have witnessed since then and the profound impact this has had on everyday life for so many.
Sustainability is another topic that has quickly come to prominence in recent times. With so many different sources of information at our fingertips today, some more trustworthy than others, I often look to the innocence of youth to get an unbiased view of the world. It's not so long ago that reminding my own children to turn off any unused lights and electrical equipment was part of our daily ritual. The boot is now firmly on the other foot, however, as I'm constantly reminded by the same two children when I'm placing litter in the wrong recycling bin! If we recycle, it means we use less energy producing and moving new goods, thereby reducing carbon dioxide emissions. As my children now regularly say, every time we remember to re-use, repair or recycle, we are playing our own small part in helping to save our planet!
So…what else can we do to reduce our carbon footprint?
Sustainability policies are commonplace in most corporates today, with a huge selection of tools, resources, and frameworks now available to support businesses with measuring and reporting on their ESG initiatives. Not only that but we are seeing a growing trend of companies publishing content relating to their sustainability initiatives to improve transparency and understanding about their organisations and enhance brand perception. Apple, the well-known technology company, recently produced an elaborate video updating the world with progress on their environmental targets. Within a day of being uploaded, the video had attracted over half a million views on “X”, the platform formerly known as Twitter.
Switching to renewable energy and reducing usage is the first critical step in reducing carbon emissions, but for this to be effective, you first need to understand current usage, set targets, and be able to measure progress. Think of it like dieting... If you don't know how much you weigh to start with, or how many calories you are consuming each day, chances are you won't be very successful in hitting your targets.
At the Ethical Business Summit, recently hosted by ATC (Association of Translation Companies), one of the points that resonated with me most was the mantra “Progress, not perfection”, taken from a keynote speech by Dallas Consulting. Striving for perfection will ultimately end in failure but an accumulation of small goals and milestone achievements will collectively make a significant difference.
One change that can have a significant impact is moving software platforms to the Public Cloud, using responsible providers focused on being carbon neutral. Cloud providers are inherently more efficient: efficiency directly drives their profitability, so it's in a cloud provider's interest to invest in state-of-the-art technology and to have monitoring in place to ensure hardware is used efficiently and capacity is optimised. Statistics suggest 20% of racked servers are not being fully utilised, or worse still, not being used at all. Cloud spend, as an outsourced expenditure, is under constant scrutiny from Procurement teams, but self-hosted and self-managed servers tend to fly under the radar.
How does AI fit into your sustainability initiative?
To say AI has taken off in recent times would be the understatement of the year. The potential for AI has no boundaries and the benefits are already being seen and used by many, but there is a hidden cost we should be aware of when it comes to sustainability that perhaps isn't immediately obvious.
Training Large Language Models (LLMs) requires huge computational power. The training of GPT-3 led to estimated emissions of around 500 tonnes of carbon dioxide. Just to put that into context, that's the equivalent of travelling 1.4 million miles in an average petrol passenger vehicle or charging 67 million smartphones.
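For the curious, the arithmetic behind those equivalences is simple. The sketch below is purely illustrative: it assumes the widely cited estimate of roughly 550 tonnes of CO2e for training GPT-3 and EPA-style equivalency factors (around 404 g of CO2 per vehicle mile and around 8.2 g per smartphone charge); these factors are my assumptions rather than figures quoted in this article's sources.

```python
# Back-of-envelope CO2 equivalences for LLM training emissions.
# Assumptions (not from the article's sources): ~552 tonnes CO2e for
# GPT-3 training, ~0.404 kg CO2 per vehicle mile, ~0.00822 kg CO2 per
# smartphone charge (EPA-style equivalency factors).

TRAINING_EMISSIONS_KG = 552_000        # ~552 tonnes, expressed in kg
KG_PER_VEHICLE_MILE = 0.404            # average petrol passenger vehicle
KG_PER_SMARTPHONE_CHARGE = 0.00822     # one full smartphone charge

vehicle_miles = TRAINING_EMISSIONS_KG / KG_PER_VEHICLE_MILE
phone_charges = TRAINING_EMISSIONS_KG / KG_PER_SMARTPHONE_CHARGE

print(f"Equivalent vehicle miles:  {vehicle_miles:,.0f}")   # ≈ 1.4 million
print(f"Equivalent phone charges:  {phone_charges:,.0f}")   # ≈ 67 million
```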
There is a lot of talk right now about integrating AI and LLMs into popular search engines. Estimates suggest that adding generative AI will require four to five times more computational power for every search. ChatGPT currently has around 13 million users a day, whereas a typical mainstream search engine handles 500 million searches every day!
We should also consider the hardware needed to build and run AI models. Most of the best-known generative AI models are processed by hyperscale providers, requiring thousands of servers equipped with GPUs (Graphics Processing Units). GPUs have also taken on a new lease of life since the advent of crypto mining and machine learning, with demand outstripping production capacity for a period. GPUs are a great fit for machine learning because they can handle huge datasets and run complex algorithms in parallel far more efficiently, but there is a trade-off in the energy needed to power them.
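To make that parallelism tangible, here is a toy sketch (assuming PyTorch is installed and a CUDA-capable GPU is available) that times the same large matrix multiplication on the CPU and on a GPU. The speed-up is what makes GPUs attractive for training; the power they draw while delivering it is the energy trade-off described above.

```python
# Toy comparison of the same matrix multiplication on CPU and GPU.
# Illustrative only; assumes PyTorch and, for the second half, a CUDA GPU.
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

start = time.time()
_ = a @ b                              # runs on the CPU
print(f"CPU matmul: {time.time() - start:.3f}s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()  # move the data to the GPU
    torch.cuda.synchronize()           # wait for the transfer to finish
    start = time.time()
    _ = a_gpu @ b_gpu                  # thousands of cores work in parallel
    torch.cuda.synchronize()           # wait for the kernel to complete
    print(f"GPU matmul: {time.time() - start:.3f}s")
```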
Datacentres currently account for around 1-3% of greenhouse gas emissions worldwide, yet estimates suggest they still won't be able to cope with the increased demand that will come from integrating generative AI into common search engines.
We are starting to see a change in mindset, and a growing shift in momentum, when it comes to building new datacentres. At the ATC Ethical Business Summit, for example, Christos Ellinides of the European Commission's Directorate-General for Translation commented that nine datacentres are being set up across Europe to host supercomputers, running mostly on green energy. One datacentre built in Luxembourg uses 100% green energy, produced from waste wood, to cool and power its supercomputer.
Why does AI use so much processing power?
The initial training of a generative AI model is the most intensive part of the process. Training a single LLM consumes roughly as much energy, and produces a comparable carbon footprint, as a commercial transatlantic flight. Once trained, using those models to respond to user prompts requires less energy per session, but the number of sessions is ever growing.
We're also seeing a big spike in the number of open-source variants of LLMs as organisations battle to become leaders in this space. This open-source arena is brimming with innovation, presenting a choice of models and making them accessible to a wider audience. From a sustainability perspective, however, it means more organisations creating or fine-tuning their own models, which in turn drives higher energy demand. The decentralised nature of this trend means the responsibility no longer sits in the hands of just a few companies, so any actions and initiatives to reduce the carbon footprint must be made accessible to more organisations.
There is a lot of research ongoing to understand the best approach when defining an LLM strategy. We are starting to ask: what is the optimal size? Is it better to build an LLM from scratch or go with a commercial model? I see lots of similarities with what the language industry went through when translation memories first came to prominence all those years ago, and industry professionals were hunting the holy grail of perfecting their TM strategy. The onset of LLMs throws open that debate once more, as we can start to tap into the potential of generative AI to supplement, educate and enhance traditional translation resources like TMs, terminology, and machine translation.
What can AI tech companies do to improve the situation?
Research is well underway into how we can reduce the amount of computing power needed to work with new data. What is already clear, however, is that it is cheaper to fine-tune an existing LLM (give it specialised training on top of what it has already learned) than it would be to create a new one from scratch.
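As a concrete illustration of what fine-tuning rather than training from scratch can look like in practice, here is a minimal sketch using the open-source Hugging Face transformers and peft libraries. The base model name and hyperparameters are illustrative assumptions on my part, not a reference to any particular vendor's workflow; the key point is that adapter-based fine-tuning updates only a tiny fraction of the weights, which is what keeps the compute (and therefore energy) bill low.

```python
# Minimal, illustrative LoRA fine-tuning setup (assumes the Hugging Face
# "transformers" and "peft" libraries; the model name is a hypothetical choice).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125m")

# Attach small low-rank adapters to the attention projections. The original
# weights stay frozen; only the adapters are trained.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```

Training then proceeds as usual on your own domain data, but because only the adapters are updated, the GPU hours involved are a small fraction of what a full training run would require.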
There is also a counter view to consider: critics say the focus on ESG has become excessive, resulting in an over-hyped fixation within organisations where ESG practices are adopted as a form of virtue signalling, sometimes at the expense of business focus and technological advancement.
It's vital that we find the right balance between value and accuracy. If improving a model's accuracy by 1-2% comes at a cost measured in metric tonnes of greenhouse gases, we must question whether the gains justify it. In certain domains and for some content streams, striving for that extra 1-2% of accuracy is critically important, but in other cases the benefits may only be nominal.
And what about localization professionals? What can they do?
Choosing the right provider is the first step in the process. Many datacentres have committed to reducing carbon emissions, but it's our responsibility to hold them accountable. Trados Enterprise relies on AWS for hosting. AWS is working towards powering its operations with 100% renewable energy by 2030 and is currently on track to meet this target by 2025. The company also aims to reach net zero carbon emissions by 2040.
Working with responsible providers can also help you meet your own carbon reduction targets so including this element as part of your selection criteria is key. Where moving to Public Cloud isn't an option, consider a responsible datacentre in preference to self-managed servers.
AI is changing the world in which we live, and I for one am not going to discourage anyone from jumping on board that journey. What we can all do, however, is ensure we are using AI responsibly and keeping in mind the environmental implications of our choices.
Take advantage of AI innovation but make sure it delivers real value. The market is becoming saturated with AI features, all claiming to be the next big thing - an unfortunate and inevitable side-effect when something grows with the speed and momentum we have seen with AI. When evaluating a new feature, ask yourself:
- Will this help me deliver cost savings?
- Will this facilitate quality improvements?
- Will this bring efficiency gains?
If the answer to all those questions is "No", then I'd suggest that feature can almost certainly be discarded, ready to go up in flames when the 5th of November comes around.